Release v0.11.0 #291

Merged · 1 commit · Sep 18, 2024
Conversation

@nfx (Contributor) commented Sep 18, 2024


* Added filter spec implementation ([#276](#276)). This commit introduces a new `FilterHandler` class that handles filter files with the suffix `.filter.json`, parsing the filter specification in the file header and validating the filter columns and types. It also adds support for three filter types: `DATE_RANGE_PICKER`, `MULTI_SELECT`, and `DROPDOWN`, each of which can be linked with multiple visualization widgets. Additionally, a `FilterTile` class, derived from the `Tile` class, represents a filter tile in the dashboard and includes methods to validate the tile, create widgets, and generate filter encodings and queries. The `DashboardMetadata` class gains a new `get_datasets()` method to retrieve the dashboard's datasets. Together, these changes let dashboards filter data across multiple linked widgets, improving customization and interactivity.
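As a rough illustration of the kind of header validation described above, the sketch below parses a JSON filter spec and rejects unknown filter types. Note that `FilterType` and `parse_filter_spec` are hypothetical names for illustration, not the library's actual API:

```python
import json
from enum import Enum


class FilterType(Enum):
    # the three supported filter types from the changelog
    DATE_RANGE_PICKER = "DATE_RANGE_PICKER"
    MULTI_SELECT = "MULTI_SELECT"
    DROPDOWN = "DROPDOWN"


def parse_filter_spec(raw: str) -> dict:
    """Parse a filter spec header and reject unknown filter types."""
    spec = json.loads(raw)
    FilterType(spec["type"])  # raises ValueError on an unknown type
    return spec


spec = parse_filter_spec('{"column": "created_at", "type": "DATE_RANGE_PICKER"}')
```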
* Bugfix: `MockBackend` wasn't mocking `save_table` properly when the mode is `append` ([#289](#289)). This release fixes and hardens the `MockBackend` component, which mocks the `SQLBackend`. The `.save_table()` method failed to function as expected in `append` mode, replacing a table's existing rows instead of accumulating them; rows now accumulate correctly in `append` mode. A new test function, `test_mock_backend_save_table_overwrite()`, demonstrates the corrected behavior of `overwrite` mode, which now replaces only the rows of the given table while preserving other tables' contents. The type signature of `.save_table()` has been tightened so that the `mode` parameter accepts only the string literals `"append"` and `"overwrite"`, and rows are now filtered to exclude `None`/`NULL` values before saving. These improvements make `MockBackend` more reliable as a testing backend.
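The corrected append/overwrite semantics can be sketched with a toy backend. This `ToyMockBackend` is a simplified stand-in for illustration only, not the real `MockBackend` API:

```python
from collections import defaultdict


class ToyMockBackend:
    """Simplified stand-in illustrating append vs. overwrite semantics."""

    def __init__(self):
        self._tables = defaultdict(list)

    def save_table(self, full_name: str, rows: list, mode: str = "append"):
        if mode not in ("append", "overwrite"):
            raise ValueError(f"unsupported mode: {mode}")
        rows = [row for row in rows if row is not None]  # drop None/NULL rows before saving
        if mode == "overwrite":
            self._tables[full_name] = list(rows)  # replace only this table's rows
        else:
            self._tables[full_name].extend(rows)  # accumulate rows across calls


backend = ToyMockBackend()
backend.save_table("catalog.schema.t", [{"x": 1}], mode="append")
backend.save_table("catalog.schema.t", [{"x": 2}, None], mode="append")
```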
* Changed filter spec to use YML instead of JSON ([#290](#290)). The filter specification files have been converted from JSON to YAML, a more human-readable format. The filter file schema includes the keys `column`, `columns`, `type`, `title`, `description`, `order`, and `id`, with `type` taking one of `DROPDOWN`, `MULTI_SELECT`, or `DATE_RANGE_PICKER`. This change affects the `FilterHandler`, the `is_filter` method, and the `_from_dashboard_folder` method, as well as the relevant documentation. The parsing methods now use `yaml.safe_load` instead of `json.loads`, and `is_filter` now checks for the `.filter.yml` suffix. A new file, `00_0_date.filter.yml`, has been added to the `tests/integration/dashboards/filter_spec_basic` directory, containing a sample date filter definition. Various tests validate filter specifications, such as rejecting an invalid `type` or a spec that defines both the `column` and `columns` keys. These updates improve readability, maintainability, and ease of filter configuration.
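A minimal sketch of parsing such a YAML spec, assuming PyYAML is installed; `load_filter_spec` and the sample spec below are illustrative, not the library's actual code:

```python
import yaml  # PyYAML, assumed available

SAMPLE = """\
column: created_at
type: DATE_RANGE_PICKER
title: Created date
order: 0
"""


def load_filter_spec(text: str) -> dict:
    """Parse a YAML filter spec and apply the column/columns exclusivity check."""
    spec = yaml.safe_load(text)
    if "column" in spec and "columns" in spec:
        raise ValueError("a filter spec may define `column` or `columns`, not both")
    return spec


spec = load_filter_spec(SAMPLE)
```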
* Increase testing of generic types storage ([#282](#282)). This commit expands the test suite for generic types storage to cover a list of structs. The `Foo` struct has been renamed to `Nested` for clarity, and two new structs, `NestedWithDict` and `Nesting`, have been added: `Nesting` contains a `Nested` object, while `NestedWithDict` holds a string and an optional dictionary of strings. A new test case appends complex types to a table by creating and saving a table with two rows, each containing a `Nesting` struct, then fetches the data and asserts that the expected number of rows is returned, verifying that the storage system handles complex data types correctly.
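The struct layout described above can be sketched with plain dataclasses; the field names here are illustrative, not necessarily the library's exact ones:

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class Nested:
    # illustrative fields; the test suite's actual fields may differ
    key: str
    value: int


@dataclass
class Nesting:
    nested: Nested  # a struct containing another struct


@dataclass
class NestedWithDict:
    name: str
    tags: Optional[Dict[str, str]] = None  # a string plus an optional dict of strings


# two rows, each containing a Nesting struct, as in the new test case
rows = [Nesting(Nested("a", 1)), Nesting(Nested("b", 2))]
```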
* Minor Changes to avoid redundancy in code and follow code patterns ([#279](#279)). The `dashboards.py` file has been made more concise, maintainable, and in line with the standard library's recommended usage. The `export_to_zipped_csv` method no longer imports `BytesIO` and uses `StringIO` for handling strings as files; it no longer creates a separate ZIP file for the CSV files, instead using the provided `export_path`, and it skips tiles that don't contain queries. A new method, `dataclass_transform`, transforms a given dataclass into a new one with specific attributes and behavior: it creates the new dataclass with a custom metaclass and adds a `to_dict()` method that converts instances to dictionaries. These changes promote reusability, reduce redundancy, and make the codebase easier to work with.
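A minimal sketch of the `to_dict()` behavior: the actual change uses a custom metaclass, but a simple decorator built on `dataclasses.asdict` illustrates the idea (`Point` is a hypothetical example class):

```python
from dataclasses import asdict, dataclass, is_dataclass


def dataclass_transform(klass):
    """Attach a to_dict() method to a dataclass (decorator sketch, not the real metaclass)."""
    if not is_dataclass(klass):
        raise TypeError(f"{klass.__name__} is not a dataclass")
    klass.to_dict = lambda self: asdict(self)  # recursively converts nested dataclasses too
    return klass


@dataclass_transform
@dataclass
class Point:
    x: int
    y: int
```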
* New example with bar chart in dashboards-as-code ([#281](#281)). A new dashboard example featuring a bar chart has been added to the dashboards-as-code feature, using the existing metadata overrides to support the new widget type without bloating the `TileMetadata` structure. An integration test demonstrates the creation of a bar chart, and the resulting dashboard is shown in the attached screenshot. A new SQL file has also been added for the `Product Sales` dashboard, showcasing sales data for different product categories. The same approach could support other widget types such as Bar, Pivot, and Area. The team is encouraged to provide feedback on this proposed solution.
@nfx nfx merged commit 02ed2c8 into main Sep 18, 2024
6 of 8 checks passed
@nfx nfx deleted the prepare/0.11.0 branch September 18, 2024 08:06

❌ 32/35 passed, 3 flaky, 3 failed, 4 skipped, 14m16s total

❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import BadRequest\nbackend = RuntimeBackend()\ntry:\n query_response = backend.fetch("SHWO DTABASES")\n return "FAILED"\nexcept BadRequest:\n return "PASSED"\n]: databricks.sdk.errors.base.DatabricksError: CalledProcessError: Command 'pip --disable-pip-version-check install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl' returned non-zero exit status 1. (1m30.599s)
08:07 DEBUG [databricks.sdk] Loaded from environment
08:07 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:07 INFO [databricks.sdk] Using Databricks Metadata Service authentication
[gw3] linux -- Python 3.10.14 /home/runner/work/lsql/lsql/.venv/bin/python
08:07 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
08:07 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmpgm4mi8w6/working-copy in /tmp/tmpgm4mi8w6
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels) does not exist."
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels"
> }
< 200 OK
< {}
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694322
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/version.json
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694326
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "last_activity_time": 1726640878539,
<   "last_restarted_time": 1726646487207,
<   "last_state_loss_time": 1726640792325,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 7050911989976746520,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "PENDING",
<   "state_message": "Starting Spark"
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~1s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~2s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~3s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~4s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "PENDING",
<   "state_message": "Starting Spark"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 4.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 16384,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.0.11",
<     "instance_id": "e37d5f4ffd02498bb190a0d4a72c0737",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "560e17f495304afc84798f2c1dd928a6",
<     "private_ip": "10.179.2.11",
<     "public_dns": "",
<     "start_timestamp": 1726646658304
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1726646778279,
<   "last_restarted_time": 1726646849201,
<   "last_state_loss_time": 1726646849144,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 4115190500459142486,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "RUNNING",
<   "state_message": ""
< }
08:07 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "7176030881967954929"
< }
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~3s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~4s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~6s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~7s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~8s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~9s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (111 more bytes)",
>   "contextId": "7176030881967954929",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c"
< }
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=f45363a1de3c46d48e90216b1ac6f64c, context_id=7176030881967954929: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=f45363a1de3c46d48e90216b1ac6f64c, context_id=7176030881967954929: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=f45363a1de3c46d48e90216b1ac6f64c, context_id=7176030881967954929: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=f45363a1de3c46d48e90216b1ac6f64c, context_id=7176030881967954929: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~4s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c",
<   "results": {
<     "cause": "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mCa... (3345 more bytes)",
<     "resultType": "error",
<     "summary": "<span class='ansi-red-fg'>CalledProcessError</span>: Command 'pip --disable-pip-version-check in... (168 more bytes)"
<   },
<   "status": "Finished"
< }
08:07 DEBUG [databricks.sdk] Loaded from environment
08:07 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:07 INFO [databricks.sdk] Using Databricks Metadata Service authentication
08:07 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
08:07 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmpgm4mi8w6/working-copy in /tmp/tmpgm4mi8w6
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels) does not exist."
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/wheels"
> }
< 200 OK
< {}
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694322
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.H8ff/version.json
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694326
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~1s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~2s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~3s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~4s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 4.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 16384,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.0.11",
<     "instance_id": "e37d5f4ffd02498bb190a0d4a72c0737",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "560e17f495304afc84798f2c1dd928a6",
<     "private_ip": "10.179.2.11",
<     "public_dns": "",
<     "start_timestamp": 1726646658304
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1726646778279,
<   "last_restarted_time": 1726646849201,
<   "last_state_loss_time": 1726646849144,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 4115190500459142486,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "RUNNING",
<   "state_message": ""
< }
08:07 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "7176030881967954929"
< }
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~3s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~4s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~6s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~7s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~8s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7176030881967954929: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~9s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7176030881967954929
< 200 OK
< {
<   "id": "7176030881967954929",
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (111 more bytes)",
>   "contextId": "7176030881967954929",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c"
< }
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=f45363a1de3c46d48e90216b1ac6f64c, context_id=7176030881967954929: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=f45363a1de3c46d48e90216b1ac6f64c, context_id=7176030881967954929: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=f45363a1de3c46d48e90216b1ac6f64c, context_id=7176030881967954929: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=f45363a1de3c46d48e90216b1ac6f64c, context_id=7176030881967954929: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~4s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f45363a1de3c46d48e90216b1ac6f64c&contextId=7176030881967954929
< 200 OK
< {
<   "id": "f45363a1de3c46d48e90216b1ac6f64c",
<   "results": {
<     "cause": "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mCa... (3345 more bytes)",
<     "resultType": "error",
<     "summary": "<span class='ansi-red-fg'>CalledProcessError</span>: Command 'pip --disable-pip-version-check in... (168 more bytes)"
<   },
<   "status": "Finished"
< }
[gw3] linux -- Python 3.10.14 /home/runner/work/lsql/lsql/.venv/bin/python
❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n query_response = backend.fetch("SELECT * FROM TEST_SCHEMA.__RANDOM__")\n return "FAILED"\nexcept NotFound as e:\n return "PASSED"\n]: databricks.sdk.errors.base.DatabricksError: CalledProcessError: Command 'pip --disable-pip-version-check install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl' returned non-zero exit status 1. (1m32.681s)
databricks.sdk.errors.base.DatabricksError: CalledProcessError: Command 'pip --disable-pip-version-check install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl' returned non-zero exit status 1.
08:07 DEBUG [databricks.sdk] Loaded from environment
08:07 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:07 INFO [databricks.sdk] Using Databricks Metadata Service authentication
[gw2] linux -- Python 3.10.14 /home/runner/work/lsql/lsql/.venv/bin/python
08:07 DEBUG [databricks.sdk] Loaded from environment
08:07 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:07 INFO [databricks.sdk] Using Databricks Metadata Service authentication
08:07 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
08:07 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmp_hfibtfb/working-copy in /tmp/tmp_hfibtfb
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels) does not exist."
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels"
> }
< 200 OK
< {}
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694321
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/version.json
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694327
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "last_activity_time": 1726640878539,
<   "last_restarted_time": 1726646487207,
<   "last_state_loss_time": 1726640792325,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 7050911989976746520,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "PENDING",
<   "state_message": "Starting Spark"
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~1s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~2s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~3s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~4s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 4.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 16384,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.0.11",
<     "instance_id": "e37d5f4ffd02498bb190a0d4a72c0737",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "560e17f495304afc84798f2c1dd928a6",
<     "private_ip": "10.179.2.11",
<     "public_dns": "",
<     "start_timestamp": 1726646658304
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1726646778279,
<   "last_restarted_time": 1726646849201,
<   "last_state_loss_time": 1726646849144,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 4115190500459142486,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "RUNNING",
<   "state_message": ""
< }
08:07 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "1025565884197132703"
< }
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
< 200 OK
< {
<   "id": "1025565884197132703",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
< 200 OK
< {
<   "id": "1025565884197132703",
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (111 more bytes)",
>   "contextId": "1025565884197132703",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "46ba2d7e765e4f089f4d72524962eac0"
< }
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=46ba2d7e765e4f089f4d72524962eac0&contextId=1025565884197132703
< 200 OK
< {
<   "id": "46ba2d7e765e4f089f4d72524962eac0",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=46ba2d7e765e4f089f4d72524962eac0, context_id=1025565884197132703: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=46ba2d7e765e4f089f4d72524962eac0&contextId=1025565884197132703
< 200 OK
< {
<   "id": "46ba2d7e765e4f089f4d72524962eac0",
<   "results": {
<     "cause": "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mCa... (3345 more bytes)",
<     "resultType": "error",
<     "summary": "<span class='ansi-red-fg'>CalledProcessError</span>: Command 'pip --disable-pip-version-check in... (168 more bytes)"
<   },
<   "status": "Finished"
< }
08:07 DEBUG [databricks.sdk] Loaded from environment
08:07 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:07 INFO [databricks.sdk] Using Databricks Metadata Service authentication
08:07 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
08:07 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmp_hfibtfb/working-copy in /tmp/tmp_hfibtfb
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels) does not exist."
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/wheels"
> }
< 200 OK
< {}
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694321
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.9TF8/version.json
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694327
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "last_activity_time": 1726640878539,
<   "last_restarted_time": 1726646487207,
<   "last_state_loss_time": 1726640792325,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 7050911989976746520,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "PENDING",
<   "state_message": "Starting Spark"
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 4.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 16384,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.0.11",
<     "instance_id": "e37d5f4ffd02498bb190a0d4a72c0737",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "560e17f495304afc84798f2c1dd928a6",
<     "private_ip": "10.179.2.11",
<     "public_dns": "",
<     "start_timestamp": 1726646658304
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1726646778279,
<   "last_restarted_time": 1726646849201,
<   "last_state_loss_time": 1726646849144,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 4115190500459142486,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "RUNNING",
<   "state_message": ""
< }
08:07 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "1025565884197132703"
< }
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
< 200 OK
< {
<   "id": "1025565884197132703",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~3s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~4s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~6s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~7s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~8s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=1025565884197132703: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~9s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=1025565884197132703
< 200 OK
< {
<   "id": "1025565884197132703",
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (111 more bytes)",
>   "contextId": "1025565884197132703",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "46ba2d7e765e4f089f4d72524962eac0"
< }
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=46ba2d7e765e4f089f4d72524962eac0&contextId=1025565884197132703
< 200 OK
< {
<   "id": "46ba2d7e765e4f089f4d72524962eac0",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=46ba2d7e765e4f089f4d72524962eac0, context_id=1025565884197132703: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=46ba2d7e765e4f089f4d72524962eac0&contextId=1025565884197132703
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=46ba2d7e765e4f089f4d72524962eac0, context_id=1025565884197132703: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=46ba2d7e765e4f089f4d72524962eac0&contextId=1025565884197132703
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=46ba2d7e765e4f089f4d72524962eac0, context_id=1025565884197132703: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=46ba2d7e765e4f089f4d72524962eac0&contextId=1025565884197132703
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=46ba2d7e765e4f089f4d72524962eac0, context_id=1025565884197132703: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~4s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=46ba2d7e765e4f089f4d72524962eac0&contextId=1025565884197132703
< 200 OK
< {
<   "id": "46ba2d7e765e4f089f4d72524962eac0",
<   "results": {
<     "cause": "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mCa... (3345 more bytes)",
<     "resultType": "error",
<     "summary": "<span class='ansi-red-fg'>CalledProcessError</span>: Command 'pip --disable-pip-version-check in... (168 more bytes)"
<   },
<   "status": "Finished"
< }
[gw2] linux -- Python 3.10.14 /home/runner/work/lsql/lsql/.venv/bin/python
❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n backend.execute("SELECT * FROM TEST_SCHEMA.__RANDOM__")\n return "FAILED"\nexcept NotFound as e:\n return "PASSED"\n]: databricks.sdk.errors.base.DatabricksError: CalledProcessError: Command 'pip --disable-pip-version-check install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl' returned non-zero exit status 1. (1m33.764s)
databricks.sdk.errors.base.DatabricksError: CalledProcessError: Command 'pip --disable-pip-version-check install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl' returned non-zero exit status 1.
08:07 DEBUG [databricks.sdk] Loaded from environment
08:07 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:07 INFO [databricks.sdk] Using Databricks Metadata Service authentication
[gw0] linux -- Python 3.10.14 /home/runner/work/lsql/lsql/.venv/bin/python
08:07 DEBUG [databricks.sdk] Loaded from environment
08:07 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:07 INFO [databricks.sdk] Using Databricks Metadata Service authentication
08:07 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
08:07 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmpz_fcc_n0/working-copy in /tmp/tmpz_fcc_n0
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels) does not exist."
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels"
> }
< 200 OK
< {}
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694320
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/version.json
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694324
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "last_activity_time": 1726640878539,
<   "last_restarted_time": 1726646487207,
<   "last_state_loss_time": 1726640792325,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 7050911989976746520,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "PENDING",
<   "state_message": "Starting Spark"
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~1s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~2s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~3s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~4s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 4.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 16384,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.0.11",
<     "instance_id": "e37d5f4ffd02498bb190a0d4a72c0737",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "560e17f495304afc84798f2c1dd928a6",
<     "private_ip": "10.179.2.11",
<     "public_dns": "",
<     "start_timestamp": 1726646658304
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1726646778279,
<   "last_restarted_time": 1726646849201,
<   "last_state_loss_time": 1726646849144,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 4115190500459142486,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "RUNNING",
<   "state_message": ""
< }
08:07 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "7811100830701523247"
< }
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~3s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~4s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~5s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~6s)
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~7s)
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~8s)
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~9s)
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~9s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (111 more bytes)",
>   "contextId": "7811100830701523247",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568"
< }
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=a2724b81af2c446f92117adfca944568&contextId=7811100830701523247
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~4s)
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~4s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=a2724b81af2c446f92117adfca944568&contextId=7811100830701523247
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568",
<   "results": {
<     "cause": "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mCa... (3344 more bytes)",
<     "resultType": "error",
<     "summary": "<span class='ansi-red-fg'>CalledProcessError</span>: Command 'pip --disable-pip-version-check in... (168 more bytes)"
<   },
<   "status": "Finished"
< }
08:07 DEBUG [databricks.sdk] Loaded from environment
08:07 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
08:07 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
08:07 INFO [databricks.sdk] Using Databricks Metadata Service authentication
08:07 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
08:07 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmpz_fcc_n0/working-copy in /tmp/tmpz_fcc_n0
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels/databricks_labs_lsql-0.10.1+1320240918080712-py3-none-any.whl
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels) does not exist."
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/wheels"
> }
< 200 OK
< {}
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694320
< }
08:07 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
08:07 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.ECVN/version.json
08:07 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 1012020733694324
< }
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "last_activity_time": 1726640878539,
<   "last_restarted_time": 1726646487207,
<   "last_state_loss_time": 1726640792325,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 7050911989976746520,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "PENDING",
<   "state_message": "Starting Spark"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~1s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~2s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~3s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~4s)
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID: (State.PENDING) Starting Spark (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 4.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 16384,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "serge.smertin@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "serge.smertin@databricks.com",
<     "DatabricksInstanceGroupId": "-5988854595292306722",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.0.11",
<     "instance_id": "e37d5f4ffd02498bb190a0d4a72c0737",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "560e17f495304afc84798f2c1dd928a6",
<     "private_ip": "10.179.2.11",
<     "public_dns": "",
<     "start_timestamp": 1726646658304
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D4s_v3",
<   "effective_spark_version": "15.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1726646778279,
<   "last_restarted_time": 1726646849201,
<   "last_state_loss_time": 1726646849144,
<   "node_type_id": "Standard_D4s_v3",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 4115190500459142486,
<   "spark_version": "15.4.x-scala2.12",
<   "spec": {
<     "apply_policy_TEST_SCHEMA_values": false,
<     "autotermination_minutes": 60,
<     "CLOUD_ENV_attributes": {},
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "enable_local_disk_encryption": false,
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "15.4.x-scala2.12"
<   },
<   "start_time": 1720469141075,
<   "state": "RUNNING",
<   "state_message": ""
< }
08:07 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "7811100830701523247"
< }
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~3s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~4s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~5s)
08:07 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:07 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~6s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~7s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~8s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Pending"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7811100830701523247: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~9s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7811100830701523247
< 200 OK
< {
<   "id": "7811100830701523247",
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (111 more bytes)",
>   "contextId": "7811100830701523247",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568"
< }
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=a2724b81af2c446f92117adfca944568&contextId=7811100830701523247
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=a2724b81af2c446f92117adfca944568&contextId=7811100830701523247
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=a2724b81af2c446f92117adfca944568&contextId=7811100830701523247
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=a2724b81af2c446f92117adfca944568&contextId=7811100830701523247
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568",
<   "results": null,
<   "status": "Running"
< }
08:08 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=a2724b81af2c446f92117adfca944568, context_id=7811100830701523247: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~4s)
08:08 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=a2724b81af2c446f92117adfca944568&contextId=7811100830701523247
< 200 OK
< {
<   "id": "a2724b81af2c446f92117adfca944568",
<   "results": {
<     "cause": "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mCa... (3344 more bytes)",
<     "resultType": "error",
<     "summary": "<span class='ansi-red-fg'>CalledProcessError</span>: Command 'pip --disable-pip-version-check in... (168 more bytes)"
<   },
<   "status": "Finished"
< }
[gw0] linux -- Python 3.10.14 /home/runner/work/lsql/lsql/.venv/bin/python
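
The traces above show the SDK polling the execution-context and command-status endpoints with a sleep interval that grows by roughly one second per attempt (`sleeping ~1s`, `~2s`, … up to `~9s`) until the status leaves `Pending`/`Running`. A minimal sketch of that wait-loop pattern (the `poll` callable, terminal states, and cap below are illustrative, not the SDK's actual implementation):

```python
import time


def wait_for_status(poll, terminal_states, max_sleep=10.0):
    """Poll `poll()` until it returns a status in `terminal_states`.

    Mirrors the pattern in the debug log above: the sleep interval
    grows by one second per attempt, capped at `max_sleep` seconds.
    """
    attempt = 1
    while True:
        status = poll()
        if status in terminal_states:
            return status
        sleep = min(attempt, max_sleep)
        print(f"current status: {status} (sleeping ~{sleep:.0f}s)")
        time.sleep(sleep)
        attempt += 1


# Illustrative usage: a fake poller that reports "Pending" twice, then "Running".
statuses = iter(["Pending", "Pending", "Running"])
final = wait_for_status(lambda: next(statuses), {"Running", "Error"}, max_sleep=0.01)
print(final)  # Running
```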

Flaky tests:

  • 🤪 test_dashboards_creates_dashboard_with_filters (9.176s)
  • 🤪 test_dashboards_creates_dashboard_with_widget_title_and_description (8.921s)
  • 🤪 test_runtime_backend_use_statements (1m33.924s)

Running from acceptance #407
