
[#104] Ignore the Forbidden exception in list_relations_without_caching #108

Merged
5 commits merged on Feb 7, 2022

Conversation

@yu-iskw (Contributor) commented Jan 25, 2022

resolves #104

Description

When we want to run only a subset of dbt models, using a service account that has permissions for just those models, dbt fails due to the lack of permissions for the other models. So, this change ignores the google.api_core.exceptions.Forbidden exception so that errors caused by missing permissions are skipped.
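A minimal sketch of the approach described above (not the actual dbt-bigquery implementation): catch google.api_core.exceptions.Forbidden while listing tables in a dataset, log it, and return an empty list so dbt treats the inaccessible dataset as having no relations instead of failing the whole invocation. The function name mirrors the adapter method, but the client and logger parameters are simplified for illustration.

```python
import logging

try:
    from google.api_core.exceptions import Forbidden
except ImportError:  # stand-in so the sketch runs without google-api-core
    class Forbidden(Exception):
        pass


def list_relations_without_caching(client, dataset_ref,
                                   logger=logging.getLogger("dbt")):
    try:
        # client.list_tables returns an iterator; materializing it is what
        # triggers the tables.list API call that can be denied.
        return list(client.list_tables(dataset_ref, max_results=100000))
    except Forbidden as exc:
        # The service account lacks bigquery.tables.list on this dataset:
        # skip it rather than aborting the run.
        logger.debug("BigQuery adapter: %s", exc)
        return []
```

With this in place, a run that touches a forbidden dataset logs the 403 at debug level and continues, which matches the behavior shown in the logs below.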

By the way, how can we implement unit tests for this change? I am not sure whether multiple service accounts are available in the CI jobs.

Checklist

  • I have signed the CLA
  • I have run this code in development and it appears to resolve the stated issue
  • This PR includes tests, or tests are not required/relevant for this PR
  • I have updated the CHANGELOG.md and added information about my change to the "dbt-bigquery next" section.

What I tested

I used the repository below to test the implementation, since I reported issue #104.
https://github.com/yu-iskw/dbt-issue-with-multiple-service-accounts-on-bigquery

  • [passed] dbt compile for partial resources
  • [passed] dbt run for partial resources
  • [passed] dbt compile for all resources
  • [failed, as expected] dbt run for all resources

dbt executions with a tag to select a subgraph

dbt compile passed
$ dbt compile --profiles-dir profiles --target dataset1 --select "tag:test_dataset1"

============================== 2022-01-25 01:33:24.566670 | 6b3528f6-1771-48d9-b1c8-1fe26789abbe ==============================
01:33:24.566670 [info ] [MainThread]: Running with dbt=1.0.1
01:33:24.567810 [debug] [MainThread]: running dbt with arguments Namespace(cls=<class 'dbt.task.compile.CompileTask'>, debug=None, defer=None, event_buffer_size=None, exclude=None, fail_fast=None, full_refresh=False, log_cache_events=False, log_format=None, parse_only=False, partial_parse=None, printer_width=None, profile=None, profiles_dir='/Users/yu/local/src/github/test-dbt-with-multiple-service-accounts/profiles', project_dir=None, record_timing_info=None, rpc_method='compile', select=['tag:test_dataset1'], selector_name=None, send_anonymous_usage_stats=None, single_threaded=False, state=None, static_parser=None, target='dataset1', threads=None, use_colors=None, use_experimental_parser=None, vars='{}', version_check=None, warn_error=None, which='compile', write_json=None)
01:33:24.568331 [debug] [MainThread]: Tracking: do not track
01:33:24.625768 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 2 files changed.
01:33:24.626649 [debug] [MainThread]: Partial parsing: updated file: dbt_issue_with_multiple_service_accounts://models/model_in_test_dataset2.sql
01:33:24.627239 [debug] [MainThread]: Partial parsing: updated file: dbt_issue_with_multiple_service_accounts://models/model_in_test_dataset1.sql
01:33:24.644270 [debug] [MainThread]: 1603: static parser failed on model_in_test_dataset2.sql
01:33:24.657173 [debug] [MainThread]: 1602: parser fallback to jinja rendering on model_in_test_dataset2.sql
01:33:24.658601 [debug] [MainThread]: 1603: static parser failed on model_in_test_dataset1.sql
01:33:24.662552 [debug] [MainThread]: 1602: parser fallback to jinja rendering on model_in_test_dataset1.sql
01:33:24.679586 [info ] [MainThread]: Found 2 models, 0 tests, 0 snapshots, 0 analyses, 189 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
01:33:24.680798 [info ] [MainThread]: 
01:33:24.681366 [debug] [MainThread]: Acquiring new bigquery connection "master"
01:33:24.682254 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project_test_dataset2"
01:33:24.682853 [debug] [ThreadPool]: Opening a new connection, currently in state init
01:33:25.675394 [info ] [ThreadPool]: BigQuery adapter: Exception: <google.api_core.page_iterator.HTTPIterator object at 0x7f7c00d3f310>
01:33:26.649409 [debug] [ThreadPool]: BigQuery adapter: Forbidden 403 GET https://bigquery.googleapis.com/bigquery/v2/projects/your-gcp-project/datasets/test_dataset2/tables?maxResults=100000&prettyPrint=false: Access Denied: Dataset your-gcp-project:test_dataset2: Permission bigquery.tables.list denied on dataset your-gcp-project:test_dataset2 (or it may not exist).
01:33:26.651278 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project_test_dataset1"
01:33:26.652031 [debug] [ThreadPool]: Opening a new connection, currently in state closed
01:33:26.653247 [info ] [ThreadPool]: BigQuery adapter: Exception: <google.api_core.page_iterator.HTTPIterator object at 0x7f7be1921190>
01:33:27.548123 [info ] [MainThread]: Concurrency: 1 threads (target='dataset1')
01:33:27.549704 [info ] [MainThread]: 
01:33:27.560824 [debug] [Thread-1  ]: Began running node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:27.562170 [debug] [Thread-1  ]: Acquiring new bigquery connection "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1"
01:33:27.562999 [debug] [Thread-1  ]: Began compiling node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:27.563689 [debug] [Thread-1  ]: Compiling model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:27.573772 [debug] [Thread-1  ]: Writing injected SQL for node "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1"
01:33:27.575058 [debug] [Thread-1  ]: finished collecting timing info
01:33:27.575613 [debug] [Thread-1  ]: Began executing node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:27.576145 [debug] [Thread-1  ]: finished collecting timing info
01:33:27.576875 [debug] [Thread-1  ]: Finished running node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:27.578237 [debug] [MainThread]: Connection 'master' was properly closed.
01:33:27.578761 [debug] [MainThread]: Connection 'model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1' was properly closed.
01:33:27.584445 [info ] [MainThread]: Done.
01:33:27.585970 [debug] [MainThread]: Flushing usage events
dbt run passed
$ dbt run --profiles-dir profiles --target dataset1 --select "tag:test_dataset1"

============================== 2022-01-25 01:33:29.980047 | 8f78929f-710c-4d5e-b78c-4c94526621cc ==============================
01:33:29.980047 [info ] [MainThread]: Running with dbt=1.0.1
01:33:29.981402 [debug] [MainThread]: running dbt with arguments Namespace(cls=<class 'dbt.task.run.RunTask'>, debug=None, defer=None, event_buffer_size=None, exclude=None, fail_fast=None, full_refresh=False, log_cache_events=False, log_format=None, partial_parse=None, printer_width=None, profile=None, profiles_dir='/Users/yu/local/src/github/test-dbt-with-multiple-service-accounts/profiles', project_dir=None, record_timing_info=None, rpc_method='run', select=['tag:test_dataset1'], selector_name=None, send_anonymous_usage_stats=None, single_threaded=False, state=None, static_parser=None, target='dataset1', threads=None, use_colors=None, use_experimental_parser=None, vars='{}', version_check=None, warn_error=None, which='run', write_json=None)
01:33:29.982346 [debug] [MainThread]: Tracking: do not track
01:33:30.055664 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
01:33:30.056256 [debug] [MainThread]: Partial parsing enabled, no changes found, skipping parsing
01:33:30.067523 [info ] [MainThread]: Found 2 models, 0 tests, 0 snapshots, 0 analyses, 189 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
01:33:30.068845 [info ] [MainThread]: 
01:33:30.069465 [debug] [MainThread]: Acquiring new bigquery connection "master"
01:33:30.070372 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project"
01:33:30.070808 [debug] [ThreadPool]: Opening a new connection, currently in state init
01:33:31.859280 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project_test_dataset1"
01:33:31.861074 [debug] [ThreadPool]: Opening a new connection, currently in state closed
01:33:31.863372 [info ] [ThreadPool]: BigQuery adapter: Exception: <google.api_core.page_iterator.HTTPIterator object at 0x7f8488500940>
01:33:32.722866 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project_test_dataset2"
01:33:32.723448 [debug] [ThreadPool]: Opening a new connection, currently in state closed
01:33:32.724164 [info ] [ThreadPool]: BigQuery adapter: Exception: <google.api_core.page_iterator.HTTPIterator object at 0x7f8478c5e5e0>
01:33:33.625051 [debug] [ThreadPool]: BigQuery adapter: Forbidden 403 GET https://bigquery.googleapis.com/bigquery/v2/projects/your-gcp-project/datasets/test_dataset2/tables?maxResults=100000&prettyPrint=false: Access Denied: Dataset your-gcp-project:test_dataset2: Permission bigquery.tables.list denied on dataset your-gcp-project:test_dataset2 (or it may not exist).
01:33:33.627717 [info ] [MainThread]: Concurrency: 1 threads (target='dataset1')
01:33:33.628866 [info ] [MainThread]: 
01:33:33.636191 [debug] [Thread-1  ]: Began running node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:33.637390 [info ] [Thread-1  ]: 1 of 1 START view model test_dataset1.model_in_test_dataset1.................... [RUN]
01:33:33.638489 [debug] [Thread-1  ]: Acquiring new bigquery connection "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1"
01:33:33.639545 [debug] [Thread-1  ]: Began compiling node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:33.640428 [debug] [Thread-1  ]: Compiling model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:33.646552 [debug] [Thread-1  ]: Writing injected SQL for node "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1"
01:33:33.647779 [debug] [Thread-1  ]: finished collecting timing info
01:33:33.648371 [debug] [Thread-1  ]: Began executing node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:33.685286 [debug] [Thread-1  ]: Writing runtime SQL for node "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1"
01:33:33.686241 [debug] [Thread-1  ]: Opening a new connection, currently in state closed
01:33:33.686800 [debug] [Thread-1  ]: On model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1: /* {"app": "dbt", "dbt_version": "1.0.1", "profile_name": "default", "target_name": "dataset1", "node_id": "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1"} */


  create or replace view `your-gcp-project`.`test_dataset1`.`model_in_test_dataset1`
  OPTIONS()
  as 



SELECT 1 AS x;


01:33:35.268845 [debug] [Thread-1  ]: finished collecting timing info
01:33:35.270291 [info ] [Thread-1  ]: 1 of 1 OK created view model test_dataset1.model_in_test_dataset1............... [OK in 1.63s]
01:33:35.271398 [debug] [Thread-1  ]: Finished running node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:33:35.273680 [debug] [MainThread]: Acquiring new bigquery connection "master"
01:33:35.274935 [info ] [MainThread]: 
01:33:35.275981 [info ] [MainThread]: Finished running 1 view model in 5.21s.
01:33:35.276862 [debug] [MainThread]: Connection 'master' was properly closed.
01:33:35.277470 [debug] [MainThread]: Connection 'model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1' was properly closed.
01:33:35.284728 [info ] [MainThread]: 
01:33:35.285299 [info ] [MainThread]: Completed successfully
01:33:35.285854 [info ] [MainThread]: 
01:33:35.286313 [info ] [MainThread]: Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
01:33:35.292078 [debug] [MainThread]: Flushing usage events

dbt executions with no model selection

dbt compile passed
$ dbt compile --profiles-dir profiles --target dataset1

============================== 2022-01-25 01:34:40.446469 | ddbad033-76b2-4c19-a694-4a4b3185f225 ==============================
01:34:40.446469 [info ] [MainThread]: Running with dbt=1.0.1
01:34:40.447366 [debug] [MainThread]: running dbt with arguments Namespace(cls=<class 'dbt.task.compile.CompileTask'>, debug=None, defer=None, event_buffer_size=None, exclude=None, fail_fast=None, full_refresh=False, log_cache_events=False, log_format=None, parse_only=False, partial_parse=None, printer_width=None, profile=None, profiles_dir='/Users/yu/local/src/github/test-dbt-with-multiple-service-accounts/profiles', project_dir=None, record_timing_info=None, rpc_method='compile', select=None, selector_name=None, send_anonymous_usage_stats=None, single_threaded=False, state=None, static_parser=None, target='dataset1', threads=None, use_colors=None, use_experimental_parser=None, vars='{}', version_check=None, warn_error=None, which='compile', write_json=None)
01:34:40.447955 [debug] [MainThread]: Tracking: do not track
01:34:40.525479 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
01:34:40.526101 [debug] [MainThread]: Partial parsing enabled, no changes found, skipping parsing
01:34:40.537291 [info ] [MainThread]: Found 2 models, 0 tests, 0 snapshots, 0 analyses, 189 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
01:34:40.538580 [info ] [MainThread]: 
01:34:40.539134 [debug] [MainThread]: Acquiring new bigquery connection "master"
01:34:40.540116 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project_test_dataset1"
01:34:40.540703 [debug] [ThreadPool]: Opening a new connection, currently in state init
01:34:41.614110 [info ] [ThreadPool]: BigQuery adapter: Exception: <google.api_core.page_iterator.HTTPIterator object at 0x7fe8284e7a30>
01:34:42.545999 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project_test_dataset2"
01:34:42.547793 [debug] [ThreadPool]: Opening a new connection, currently in state closed
01:34:42.549810 [info ] [ThreadPool]: BigQuery adapter: Exception: <google.api_core.page_iterator.HTTPIterator object at 0x7fe8284e7df0>
01:34:43.450037 [debug] [ThreadPool]: BigQuery adapter: Forbidden 403 GET https://bigquery.googleapis.com/bigquery/v2/projects/your-gcp-project/datasets/test_dataset2/tables?maxResults=100000&prettyPrint=false: Access Denied: Dataset your-gcp-project:test_dataset2: Permission bigquery.tables.list denied on dataset your-gcp-project:test_dataset2 (or it may not exist).
01:34:43.452949 [info ] [MainThread]: Concurrency: 1 threads (target='dataset1')
01:34:43.454151 [info ] [MainThread]: 
01:34:43.465423 [debug] [Thread-1  ]: Began running node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:34:43.466747 [debug] [Thread-1  ]: Acquiring new bigquery connection "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1"
01:34:43.467562 [debug] [Thread-1  ]: Began compiling node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:34:43.468394 [debug] [Thread-1  ]: Compiling model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:34:43.477096 [debug] [Thread-1  ]: Writing injected SQL for node "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1"
01:34:43.478883 [debug] [Thread-1  ]: finished collecting timing info
01:34:43.479943 [debug] [Thread-1  ]: Began executing node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:34:43.481218 [debug] [Thread-1  ]: finished collecting timing info
01:34:43.483117 [debug] [Thread-1  ]: Finished running node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset1
01:34:43.484385 [debug] [Thread-1  ]: Began running node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset2
01:34:43.485739 [debug] [Thread-1  ]: Acquiring new bigquery connection "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset2"
01:34:43.486401 [debug] [Thread-1  ]: Began compiling node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset2
01:34:43.487023 [debug] [Thread-1  ]: Compiling model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset2
01:34:43.491736 [debug] [Thread-1  ]: Writing injected SQL for node "model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset2"
01:34:43.492691 [debug] [Thread-1  ]: finished collecting timing info
01:34:43.493140 [debug] [Thread-1  ]: Began executing node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset2
01:34:43.493559 [debug] [Thread-1  ]: finished collecting timing info
01:34:43.494180 [debug] [Thread-1  ]: Finished running node model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset2
01:34:43.495362 [debug] [MainThread]: Connection 'master' was properly closed.
01:34:43.495839 [debug] [MainThread]: Connection 'model.dbt_issue_with_multiple_service_accounts.model_in_test_dataset2' was properly closed.
01:34:43.501154 [info ] [MainThread]: Done.
01:34:43.502182 [debug] [MainThread]: Flushing usage events
dbt run failed
$ dbt run --profiles-dir profiles --target dataset1

============================== 2022-01-25 01:34:45.950084 | 9e308a83-4a37-40a9-bfd5-46016cf3afe7 ==============================
01:34:45.950084 [info ] [MainThread]: Running with dbt=1.0.1
01:34:45.950815 [debug] [MainThread]: running dbt with arguments Namespace(cls=<class 'dbt.task.run.RunTask'>, debug=None, defer=None, event_buffer_size=None, exclude=None, fail_fast=None, full_refresh=False, log_cache_events=False, log_format=None, partial_parse=None, printer_width=None, profile=None, profiles_dir='/Users/yu/local/src/github/test-dbt-with-multiple-service-accounts/profiles', project_dir=None, record_timing_info=None, rpc_method='run', select=None, selector_name=None, send_anonymous_usage_stats=None, single_threaded=False, state=None, static_parser=None, target='dataset1', threads=None, use_colors=None, use_experimental_parser=None, vars='{}', version_check=None, warn_error=None, which='run', write_json=None)
01:34:45.951385 [debug] [MainThread]: Tracking: do not track
01:34:46.007188 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
01:34:46.007837 [debug] [MainThread]: Partial parsing enabled, no changes found, skipping parsing
01:34:46.020586 [info ] [MainThread]: Found 2 models, 0 tests, 0 snapshots, 0 analyses, 189 macros, 0 operations, 0 seed files, 0 sources, 0 exposures, 0 metrics
01:34:46.021878 [info ] [MainThread]: 
01:34:46.022434 [debug] [MainThread]: Acquiring new bigquery connection "master"
01:34:46.023597 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project"
01:34:46.024041 [debug] [ThreadPool]: Opening a new connection, currently in state init
01:34:48.259091 [debug] [ThreadPool]: Acquiring new bigquery connection "list_your-gcp-project"
01:34:48.261542 [debug] [ThreadPool]: Opening a new connection, currently in state closed
01:34:49.557734 [debug] [ThreadPool]: Acquiring new bigquery connection "create_your-gcp-project_test_dataset2"
01:34:49.559708 [debug] [ThreadPool]: Acquiring new bigquery connection "create_your-gcp-project_test_dataset2"
01:34:49.560697 [debug] [ThreadPool]: BigQuery adapter: Creating schema "your-gcp-project.test_dataset2".
01:34:49.561625 [debug] [ThreadPool]: Opening a new connection, currently in state closed
01:34:50.445098 [debug] [MainThread]: Connection 'master' was properly closed.
01:34:50.446066 [debug] [MainThread]: Connection 'create_your-gcp-project_test_dataset2' was properly closed.
01:34:50.448136 [debug] [MainThread]: Flushing usage events
01:34:50.449264 [error] [MainThread]: Encountered an error:
Database Error
  Access Denied: Project your-gcp-project: User does not have bigquery.datasets.create permission in project your-gcp-project.
01:34:50.473654 [debug] [MainThread]: Traceback (most recent call last):
  File "/Users/yu/local/src/github/dbt-bigquery/dbt/adapters/bigquery/connections.py", line 181, in exception_handler
    yield
  File "/Users/yu/local/src/github/dbt-bigquery/dbt/adapters/bigquery/connections.py", line 590, in _retry_and_handle
    return retry.retry_target(
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/google/api_core/retry.py", line 190, in retry_target
    return target()
  File "/Users/yu/local/src/github/dbt-bigquery/dbt/adapters/bigquery/connections.py", line 560, in fn
    return client.create_dataset(dataset, exists_ok=True)
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 610, in create_dataset
    api_response = self._call_api(
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 760, in _call_api
    return call()
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/google/api_core/retry.py", line 283, in retry_wrapped_func
    return retry_target(
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/google/api_core/retry.py", line 190, in retry_target
    return target()
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/google/cloud/_http/__init__.py", line 480, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 POST https://bigquery.googleapis.com/bigquery/v2/projects/your-gcp-project/datasets?prettyPrint=false: Access Denied: Project your-gcp-project: User does not have bigquery.datasets.create permission in project your-gcp-project.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/main.py", line 127, in main
    results, succeeded = handle_and_check(args)
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/main.py", line 192, in handle_and_check
    task, res = run_from_args(parsed)
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/main.py", line 246, in run_from_args
    results = task.run()
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/task/runnable.py", line 476, in run
    result = self.execute_with_hooks(selected_uids)
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/task/runnable.py", line 431, in execute_with_hooks
    self.before_run(adapter, selected_uids)
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/task/run.py", line 457, in before_run
    self.create_schemas(adapter, selected_uids)
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/task/runnable.py", line 587, in create_schemas
    create_future.result()
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/concurrent/futures/_base.py", line 437, in result
    return self.__get_result()
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
    raise self._exception
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/utils.py", line 469, in connected
    return func(*args, **kwargs)
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/site-packages/dbt/task/runnable.py", line 550, in create_schema
    adapter.create_schema(relation)
  File "/Users/yu/local/src/github/dbt-bigquery/dbt/adapters/bigquery/impl.py", line 326, in create_schema
    self.connections.create_dataset(database, schema)
  File "/Users/yu/local/src/github/dbt-bigquery/dbt/adapters/bigquery/connections.py", line 561, in create_dataset
    self._retry_and_handle(msg='create dataset', conn=conn, fn=fn)
  File "/Users/yu/local/src/github/dbt-bigquery/dbt/adapters/bigquery/connections.py", line 590, in _retry_and_handle
    return retry.retry_target(
  File "/Users/yu/anaconda2/envs/dbt-bigquery-dev/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/Users/yu/local/src/github/dbt-bigquery/dbt/adapters/bigquery/connections.py", line 189, in exception_handler
    self.handle_error(e, message)
  File "/Users/yu/local/src/github/dbt-bigquery/dbt/adapters/bigquery/connections.py", line 173, in handle_error
    raise DatabaseException(error_msg)
dbt.exceptions.DatabaseException: Database Error
  Access Denied: Project your-gcp-project: User does not have bigquery.datasets.create permission in project your-gcp-project.

@cla-bot cla-bot bot added the cla:yes label Jan 25, 2022
@McKnight-42 McKnight-42 self-requested a review January 26, 2022 18:44
@yu-iskw (Contributor, Author) commented Jan 31, 2022

@McKnight-42 I don't understand the release cycle after separating dbt-bigquery from dbt-core. Do you know when the next release is? I would like to use this change as soon as possible.

@McKnight-42 (Contributor) commented Jan 31, 2022

@yu-iskw Hi, sorry for not responding sooner. Regarding your CI question, we have added a new environment variable containing a second GCP project to test against, and we are currently working on making sure the integration tests work properly. As for the release schedule now that adapters are separate: we are still defining the actual schedule for when we will be doing patches. Thank you so much for working on this enhancement.

@yu-iskw (Contributor, Author) commented Feb 2, 2022

@McKnight-42 Thank you for supporting me in #111. I look forward to resolving the tests. If there is anything I can help with, please let me know.

@McKnight-42 (Contributor) commented

@yu-iskw Any thoughts you have about #111 are more than welcome; we would love to make it a full test.

As for this repo: if you could update to the latest version of main and re-push, that should clear up any changelog conflicts you're having, and we should be able to review your PR after that.

@yu-iskw (Contributor, Author) commented Feb 7, 2022

@McKnight-42 I have resolved the conflicts with the default branch. I will look into #111 so that we can enhance the unit tests. Many thanks!

@McKnight-42 (Contributor) left a review:

LGTM thank you for putting in the time on this.

@McKnight-42 McKnight-42 merged commit 87095c4 into dbt-labs:main Feb 7, 2022
@yu-iskw (Contributor, Author) commented Feb 8, 2022

@McKnight-42 It's my pleasure. Thank you for supporting me too.

@yu-iskw (Contributor, Author) commented Feb 9, 2022

@McKnight-42 The CI jobs on the main branch failed. Is there anything I should fix?
https://github.com/dbt-labs/dbt-bigquery/actions/runs/1807765134

@McKnight-42 (Contributor) commented

@yu-iskw You're good; these failures are due to different changes.

@jtcohen6 jtcohen6 mentioned this pull request Feb 17, 2022
siephen pushed a commit to AgencyPMG/dbt-bigquery that referenced this pull request May 16, 2022
…thout_caching` (dbt-labs#108)

* [dbt-labs#104] Ignore the forbidden exception in `list_relations_without_caching`

* Update CHANGELOG.md

* Further update CHANGELOG.md

Co-authored-by: Matthew McKnight <91097623+McKnight-42@users.noreply.github.com>
Successfully merging this pull request may close these issues.

Can't deal with partial resources due to the lack of permissions for other resources