
Allow dbt to cancel connections #718

Merged (18 commits) on Apr 13, 2024

Conversation

holly-evans
Contributor

@holly-evans holly-evans commented Feb 24, 2024

resolves #705
docs dbt-labs/docs.getdbt.com/#

Problem

Interrupting a dbt execution does not kill the currently running queries; the dbt process does not stop until the queries have completed. This happens because dbt issues the pg_terminate_backend command with the wrong pid.

Solution

This PR wraps Connection.handle to capture the backend pid when the connection is initiated, and uses that connection-specific pid to cancel connections. Storing the pid before the model query begins ensures that we can run select pg_terminate_backend(pid) on cancel, even while a query is in progress.

I tried simply switching _get_backend_id to use the connection passed in to cancel, but it could not query for the pid until the running query finished.
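The wrapper idea can be sketched roughly as follows. All names here are illustrative rather than the PR's actual code, and a stub stands in for the redshift_connector connection; the key point is that the pid is queried once, while the connection is still idle:

```python
class ConnectionWrapper:
    """Delegate to the underlying connection, but record the backend pid
    once, at connection time, so a later cancel does not have to query a
    connection that is busy (or blocked) running a model query."""

    def __init__(self, conn):
        self._conn = conn
        cursor = conn.cursor()
        cursor.execute("select pg_backend_pid()")
        self.backend_pid = cursor.fetchone()[0]

    def __getattr__(self, name):
        # Everything except backend_pid behaves like the wrapped connection.
        return getattr(self._conn, name)


# Stub connection for demonstration; pretend the session's pid is 4242.
class StubCursor:
    def execute(self, sql):
        self._row = (4242,) if "pg_backend_pid" in sql else None

    def fetchone(self):
        return self._row


class StubConnection:
    def cursor(self):
        return StubCursor()


handle = ConnectionWrapper(StubConnection())
print(handle.backend_pid)  # pid captured at wrap time, before any model query runs
```

Because `__getattr__` forwards everything else, callers can keep treating the handle as a normal connection object.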

Context

  1. Currently, select pg_backend_pid() is called at the time of cancel, on the master connection. The pid is connection-specific, so when select pg_terminate_backend(pid) is called with that pid, it targets a non-existent query on master rather than the query running on the connection passed to cancel.

  2. Interrupting a dbt execution worked until dbt-redshift 1.5, because psycopg2 stored a backend pid on the handle that was used to cancel; redshift_connector does not expose this pid.

  3. If a connection is actively querying, attempting to run select pg_backend_pid() on that connection is delayed until the active query finishes (which may never happen in the case of a blocking lock).

  4. dbt-databricks uses a connection wrapper to augment the handle, which inspired this approach.
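Put together, the corrected cancel path issues pg_terminate_backend from the master connection, using the pid stored when the worker connection was opened. A minimal sketch with stand-in objects (none of these names are dbt-redshift's actual API):

```python
class RecordingCursor:
    """Records executed SQL so the cancel flow can be observed."""

    def __init__(self, log):
        self._log = log

    def execute(self, sql):
        self._log.append(sql)


class MasterHandle:
    """Plays the role of the idle 'master' connection."""

    def __init__(self):
        self.executed = []

    def cursor(self):
        return RecordingCursor(self.executed)


def cancel(master_handle, backend_pid):
    # Terminate the worker session by pid. Because this runs on the master
    # connection, it succeeds even while the worker is blocked on a lock.
    master_handle.cursor().execute(f"select pg_terminate_backend({backend_pid})")


master = MasterHandle()
cancel(master, 1073873737)
print(master.executed[0])  # select pg_terminate_backend(1073873737)
```

The difference from the buggy behavior is only in where the pid comes from: it is the worker's pid captured at open time, not the master's pid queried at cancel time.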

Checklist

  • I have read the contributing guide and understand what's expected of me
  • I have run this code in development and it appears to resolve the stated issue
  • This PR includes tests, or tests are not required/relevant for this PR
    • I have not run the functional tests, but the change was successful in my project: I was able to interrupt a run on a locked table
  • This PR has no interface changes (e.g. macros, cli, logs, json artifacts, config files, adapter interface, etc) or this PR has already received feedback and approval from Product or DX


cla-bot bot commented Feb 24, 2024

Thanks for your pull request, and welcome to our community! We require contributors to sign our Contributor License Agreement and we don't seem to have your signature on file. Check out this article for more information on why we have a CLA.

In order for us to review and merge your code, please submit the Individual Contributor License Agreement form attached above. If you have questions about the CLA, or if you believe you've received this message in error, please reach out through a comment on this PR.

CLA has not been signed by users: @holly-evans

@holly-evans
Contributor Author

Signed the CLA just now.

@holly-evans
Contributor Author

I locked dbt_hevans.accounting_feature_value before the dbt command to simulate my issue. CTRL+C has no effect until I roll back the lock.

Output before:

╰─➤  dbt seed -s accounting_feature_value
02:01:42  Running with dbt=1.7.8
02:01:43  Registered adapter: redshift=1.7.3
02:01:44  Found 439 models, 261 tests, 12 seeds, 4 operations, 333 sources, 0 exposures, 0 metrics, 623 macros, 0 groups, 0 semantic models
02:01:44  
02:01:46  
02:01:46  Running 1 on-run-start hook
02:01:46  Skipping udf check and create. Target ("user") not in ['ci', 'docs']
02:01:46  1 of 1 START hook: ncsa_dbt.on-run-start.0 ..................................... [RUN]
02:01:46  1 of 1 OK hook: ncsa_dbt.on-run-start.0 ........................................ [OK in 0.00s]
02:01:46  
02:01:46  Concurrency: 4 threads (target='user')
02:01:46  
02:01:46  1 of 1 START seed file dbt_hevans.accounting_feature_value ..................... [RUN]
^C02:02:24  1 of 1 ERROR loading seed file dbt_hevans.accounting_feature_value ............. [ERROR in 37.95s]
02:02:24  
02:02:24  Finished running 1 hook in 0 hours 0 minutes and 40.08 seconds (40.08s).
02:02:24  Encountered an error:
{'S': 'ERROR', 'C': '42P05', 'M': 'prepared statement "redshift_connector_statement_21561_2" already exists', 'F': '../src/pg/src/backend/commands/commands_prepare.c', 'L': '645', 'R': 'StorePreparedStatement'}
02:02:24  Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 359, in execute_nodes
    self.run_queue(pool)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 291, in run_queue
    self.job_queue.join()
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/graph/queue.py", line 198, in join
    self.inner.join()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/queue.py", line 90, in join
    self.all_tasks_done.wait()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/threading.py", line 320, in wait
    waiter.acquire()
KeyboardInterrupt

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 1793, in execute
    ps = cache["ps"][key]
         ~~~~~~~~~~~^^^^^
KeyError: ('select pg_terminate_backend(1073873737)', ())

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 91, in wrapper
    result, success = func(*args, **kwargs)
                      ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 76, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 169, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 198, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 245, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 278, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/main.py", line 761, in seed
    results = task.run()
              ^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 474, in run
    result = self.execute_with_hooks(selected_uids)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 435, in execute_with_hooks
    res = self.execute_nodes()
          ^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 384, in execute_nodes
    self._cancel_connections(pool)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 332, in _cancel_connections
    for conn_name in adapter.cancel_open_connections():
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/base/impl.py", line 1250, in cancel_open_connections
    return self.connections.cancel_open()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/sql/connections.py", line 43, in cancel_open
    self.cancel(connection)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/redshift/connections.py", line 260, in cancel
    cursor.execute(sql)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/cursor.py", line 248, in execute
    raise e
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/cursor.py", line 241, in execute
    self._c.execute(self, operation, args)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 1874, in execute
    self.handle_messages(cursor)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 2166, in handle_messages
    raise self.error
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/redshift/connections.py", line 273, in exception_handler
    yield
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/sql/connections.py", line 80, in add_query
    cursor.execute(sql, bindings)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/cursor.py", line 248, in execute
    raise e
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/cursor.py", line 241, in execute
    self._c.execute(self, operation, args)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 1959, in execute
    self.handle_messages(cursor)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 2166, in handle_messages
    raise self.error
redshift_connector.error.ProgrammingError: {'S': 'ERROR', 'C': '42P05', 'M': 'prepared statement "redshift_connector_statement_21561_2" already exists', 'F': '../src/pg/src/backend/commands/commands_prepare.c', 'L': '645', 'R': 'StorePreparedStatement'}

Output after:

╰─➤  dbt seed -s accounting_feature_value                                                                                                                 2 ↵
01:51:32  Running with dbt=1.7.8
01:51:32  Registered adapter: redshift=1.7.3
01:51:33  Found 439 models, 261 tests, 12 seeds, 4 operations, 333 sources, 0 exposures, 0 metrics, 623 macros, 0 groups, 0 semantic models
01:51:33  
01:51:35  
01:51:35  Running 1 on-run-start hook
01:51:35  Skipping udf check and create. Target ("user") not in ['ci', 'docs']
01:51:35  1 of 1 START hook: ncsa_dbt.on-run-start.0 ..................................... [RUN]
01:51:35  1 of 1 OK hook: ncsa_dbt.on-run-start.0 ........................................ [OK in 0.00s]
01:51:35  
01:51:35  Concurrency: 4 threads (target='user')
01:51:35  
01:51:35  1 of 1 START seed file dbt_hevans.accounting_feature_value ..................... [RUN]
^C01:51:43  CANCEL query seed.ncsa_dbt.accounting_feature_value ............................ [CANCEL]
01:51:43  1 of 1 ERROR loading seed file dbt_hevans.accounting_feature_value ............. [ERROR in 7.22s]
01:51:43  
01:51:43  Exited because of keyboard interrupt
01:51:43  
01:51:43  Done. PASS=0 WARN=0 ERROR=0 SKIP=0 TOTAL=0
01:51:43  
01:51:43  Finished running 1 hook in 0 hours 0 minutes and 9.80 seconds (9.80s).
01:51:43  Encountered an error:

01:51:43  Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 91, in wrapper
    result, success = func(*args, **kwargs)
                      ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 76, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 169, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 198, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 245, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 278, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/main.py", line 761, in seed
    results = task.run()
              ^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 474, in run
    result = self.execute_with_hooks(selected_uids)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 435, in execute_with_hooks
    res = self.execute_nodes()
          ^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 359, in execute_nodes
    self.run_queue(pool)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 291, in run_queue
    self.job_queue.join()
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/graph/queue.py", line 198, in join
    self.inner.join()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/queue.py", line 90, in join
    self.all_tasks_done.wait()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/threading.py", line 320, in wait
    waiter.acquire()
KeyboardInterrupt

Logs before:


============================== 20:01:42.835193 | a747ba5e-3ec2-4e92-9a53-8d073e024a8f ==============================
20:01:42.835193 [info ] [MainThread]: Running with dbt=1.7.8
20:01:42.835607 [debug] [MainThread]: running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'write_json': 'True', 'log_cache_events': 'False', 'partial_parse': 'True', 'cache_selected_only': 'False', 'profiles_dir': '.', 'version_check': 'True', 'fail_fast': 'False', 'log_path': 'logs', 'warn_error': 'None', 'debug': 'False', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'log_format': 'default', 'static_parser': 'True', 'warn_error_options': "WarnErrorOptions(include=['NoNodesForSelectionCriteria'], exclude=[])", 'invocation_command': 'dbt seed -s accounting_feature_value', 'target_path': 'None', 'introspect': 'True', 'send_anonymous_usage_stats': 'True'}
20:01:43.879378 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'project_id', 'label': 'a747ba5e-3ec2-4e92-9a53-8d073e024a8f', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x109d0be90>]}
20:01:43.919159 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': 'a747ba5e-3ec2-4e92-9a53-8d073e024a8f', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10af50250>]}
20:01:43.919667 [info ] [MainThread]: Registered adapter: redshift=1.7.3
20:01:43.928934 [debug] [MainThread]: checksum: 67f0013ca5f0bd43af9a0873dd50792fde83ef69de63b71cacd0b4ac656c52e5, vars: {}, profile: , target: , version: 1.7.8
20:01:44.121097 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
20:01:44.121465 [debug] [MainThread]: Partial parsing enabled, no changes found, skipping parsing
20:01:44.176632 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': 'a747ba5e-3ec2-4e92-9a53-8d073e024a8f', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10e132e90>]}
20:01:44.318506 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': 'a747ba5e-3ec2-4e92-9a53-8d073e024a8f', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10ebd00d0>]}
20:01:44.318951 [info ] [MainThread]: Found 439 models, 261 tests, 12 seeds, 4 operations, 333 sources, 0 exposures, 0 metrics, 623 macros, 0 groups, 0 semantic models
20:01:44.319273 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'a747ba5e-3ec2-4e92-9a53-8d073e024a8f', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10ef67e90>]}
20:01:44.325456 [info ] [MainThread]: 
20:01:44.326117 [debug] [MainThread]: Acquiring new redshift connection 'master'
20:01:44.326951 [debug] [ThreadPool]: Acquiring new redshift connection 'list_fasttrackprod'
20:01:44.335487 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod"
20:01:44.335790 [debug] [ThreadPool]: On list_fasttrackprod: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "connection_name": "list_fasttrackprod"} */

    select distinct nspname from pg_namespace
20:01:44.336010 [debug] [ThreadPool]: Opening a new connection, currently in state init
20:01:44.337764 [debug] [ThreadPool]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
20:01:44.338018 [debug] [ThreadPool]: Redshift adapter: Connecting to redshift with username/password based auth...
20:01:44.812924 [debug] [ThreadPool]: SQL status: SUCCESS in 0.0 seconds
20:01:44.816105 [debug] [ThreadPool]: On list_fasttrackprod: Close
20:01:44.829292 [debug] [ThreadPool]: Re-using an available connection from the pool (formerly list_fasttrackprod, now list_fasttrackprod_dbt_hevans)
20:01:44.834233 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod_dbt_hevans"
20:01:44.834556 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: BEGIN
20:01:44.834797 [debug] [ThreadPool]: Opening a new connection, currently in state closed
20:01:44.835136 [debug] [ThreadPool]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
20:01:44.835387 [debug] [ThreadPool]: Redshift adapter: Connecting to redshift with username/password based auth...
20:01:45.203080 [debug] [ThreadPool]: SQL status: SUCCESS in 0.0 seconds
20:01:45.203759 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod_dbt_hevans"
20:01:45.204207 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "connection_name": "list_fasttrackprod_dbt_hevans"} */
select
        table_catalog as database,
        table_name as name,
        table_schema as schema,
        'table' as type
    from information_schema.tables
    where table_schema ilike 'dbt_hevans'
    and table_type = 'BASE TABLE'
    union all
    select
      table_catalog as database,
      table_name as name,
      table_schema as schema,
      case
        when view_definition ilike '%create materialized view%'
          then 'materialized_view'
        else 'view'
      end as type
    from information_schema.views
    where table_schema ilike 'dbt_hevans'
20:01:45.352032 [debug] [ThreadPool]: SQL status: SUCCESS in 0.0 seconds
20:01:45.353052 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: ROLLBACK
20:01:45.447105 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: Close
20:01:45.464908 [debug] [MainThread]: Using redshift connection "master"
20:01:45.465372 [debug] [MainThread]: On master: BEGIN
20:01:45.465617 [debug] [MainThread]: Opening a new connection, currently in state init
20:01:45.465948 [debug] [MainThread]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
20:01:45.466204 [debug] [MainThread]: Redshift adapter: Connecting to redshift with username/password based auth...
20:01:45.816609 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
20:01:45.817202 [debug] [MainThread]: Using redshift connection "master"
20:01:45.817636 [debug] [MainThread]: On master: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "connection_name": "master"} */
with
    relation as (
        select
            pg_class.oid as relation_id,
            pg_class.relname as relation_name,
            pg_class.relnamespace as schema_id,
            pg_namespace.nspname as schema_name,
            pg_class.relkind as relation_type
        from pg_class
        join pg_namespace
          on pg_class.relnamespace = pg_namespace.oid
        where pg_namespace.nspname != 'information_schema'
          and pg_namespace.nspname not like 'pg\_%'
    ),
    dependency as (
        select distinct
            coalesce(pg_rewrite.ev_class, pg_depend.objid) as dep_relation_id,
            pg_depend.refobjid as ref_relation_id,
            pg_depend.refclassid as ref_class_id
        from pg_depend
        left join pg_rewrite
          on pg_depend.objid = pg_rewrite.oid
        where coalesce(pg_rewrite.ev_class, pg_depend.objid) != pg_depend.refobjid
    )

select distinct
    dep.schema_name as dependent_schema,
    dep.relation_name as dependent_name,
    ref.schema_name as referenced_schema,
    ref.relation_name as referenced_name
from dependency
join relation ref
    on dependency.ref_relation_id = ref.relation_id
join relation dep
    on dependency.dep_relation_id = dep.relation_id
20:01:46.077951 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
20:01:46.080469 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'a747ba5e-3ec2-4e92-9a53-8d073e024a8f', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10a454e10>]}
20:01:46.081600 [debug] [MainThread]: On master: ROLLBACK
20:01:46.207173 [debug] [MainThread]: Using redshift connection "master"
20:01:46.208499 [debug] [MainThread]: On master: BEGIN
20:01:46.253531 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
20:01:46.255605 [debug] [MainThread]: On master: COMMIT
20:01:46.257303 [debug] [MainThread]: Using redshift connection "master"
20:01:46.258226 [debug] [MainThread]: On master: COMMIT
20:01:46.344627 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
20:01:46.346532 [info ] [MainThread]: 
20:01:46.347300 [info ] [MainThread]: Running 1 on-run-start hook
20:01:46.359300 [info ] [MainThread]: Skipping udf check and create. Target ("user") not in ['ci', 'docs']
20:01:46.360305 [debug] [MainThread]: Writing injected SQL for node "operation.ncsa_dbt.ncsa_dbt-on-run-start-0"
20:01:46.364792 [info ] [MainThread]: 1 of 1 START hook: ncsa_dbt.on-run-start.0 ..................................... [RUN]
20:01:46.365700 [info ] [MainThread]: 1 of 1 OK hook: ncsa_dbt.on-run-start.0 ........................................ [OK in 0.00s]
20:01:46.366276 [info ] [MainThread]: 
20:01:46.366774 [debug] [MainThread]: On master: Close
20:01:46.367747 [info ] [MainThread]: Concurrency: 4 threads (target='user')
20:01:46.368153 [info ] [MainThread]: 
20:01:46.370440 [debug] [Thread-1 (]: Began running node seed.ncsa_dbt.accounting_feature_value
20:01:46.371068 [info ] [Thread-1 (]: 1 of 1 START seed file dbt_hevans.accounting_feature_value ..................... [RUN]
20:01:46.371855 [debug] [Thread-1 (]: Re-using an available connection from the pool (formerly list_fasttrackprod_dbt_hevans, now seed.ncsa_dbt.accounting_feature_value)
20:01:46.372278 [debug] [Thread-1 (]: Began compiling node seed.ncsa_dbt.accounting_feature_value
20:01:46.372725 [debug] [Thread-1 (]: Timing info for seed.ncsa_dbt.accounting_feature_value (compile): 20:01:46.372531 => 20:01:46.372533
20:01:46.373091 [debug] [Thread-1 (]: Began executing node seed.ncsa_dbt.accounting_feature_value
20:01:46.394538 [debug] [Thread-1 (]: Using redshift connection "seed.ncsa_dbt.accounting_feature_value"
20:01:46.401432 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: BEGIN
20:01:46.401869 [debug] [Thread-1 (]: Opening a new connection, currently in state closed
20:01:46.404401 [debug] [Thread-1 (]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
20:01:46.404847 [debug] [Thread-1 (]: Redshift adapter: Connecting to redshift with username/password based auth...
20:01:46.749391 [debug] [Thread-1 (]: SQL status: SUCCESS in 0.0 seconds
20:01:46.751770 [debug] [Thread-1 (]: Using redshift connection "seed.ncsa_dbt.accounting_feature_value"
20:01:46.752781 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "node_id": "seed.ncsa_dbt.accounting_feature_value"} */
truncate table "fasttrackprod"."dbt_hevans"."accounting_feature_value"
20:01:51.512610 [debug] [MainThread]: Using redshift connection "master"
20:01:51.513581 [debug] [MainThread]: On master: BEGIN
20:01:51.514175 [debug] [MainThread]: Opening a new connection, currently in state closed
20:01:51.515004 [debug] [MainThread]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
20:01:51.515589 [debug] [MainThread]: Redshift adapter: Connecting to redshift with username/password based auth...
20:01:51.917289 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
20:01:51.919877 [debug] [MainThread]: Using redshift connection "master"
20:01:51.921377 [debug] [MainThread]: On master: select pg_backend_pid()
20:01:52.013501 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
20:01:52.015733 [debug] [MainThread]: Redshift adapter: Cancel query on: 'seed.ncsa_dbt.accounting_feature_value' with PID: 1073873737
20:01:52.016603 [debug] [MainThread]: Redshift adapter: select pg_terminate_backend(1073873737)
20:02:20.734107 [debug] [Thread-1 (]: Redshift adapter: Redshift error: prepared statement "redshift_connector_statement_21561_2" already exists
20:02:24.291136 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: ROLLBACK
20:02:24.292387 [debug] [MainThread]: On master: ROLLBACK
20:02:24.294095 [debug] [Thread-1 (]: Redshift adapter: Error running SQL: macro truncate_relation
20:02:24.298082 [debug] [Thread-1 (]: Redshift adapter: Rolling back transaction.
20:02:24.299235 [debug] [Thread-1 (]: Timing info for seed.ncsa_dbt.accounting_feature_value (execute): 20:01:46.373431 => 20:02:24.298810
20:02:24.300471 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: Close
20:02:24.323889 [debug] [Thread-1 (]: Database Error in seed accounting_feature_value (seeds/accounting_feature_value.csv)
  prepared statement "redshift_connector_statement_21561_2" already exists
20:02:24.325627 [debug] [Thread-1 (]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': 'a747ba5e-3ec2-4e92-9a53-8d073e024a8f', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10e170bd0>]}
20:02:24.326994 [error] [Thread-1 (]: 1 of 1 ERROR loading seed file dbt_hevans.accounting_feature_value ............. [ERROR in 37.95s]
20:02:24.328112 [debug] [Thread-1 (]: Finished running node seed.ncsa_dbt.accounting_feature_value
20:02:24.399099 [debug] [MainThread]: On master: Close
20:02:24.402328 [debug] [MainThread]: Connection 'master' was properly closed.
20:02:24.403249 [debug] [MainThread]: Connection 'seed.ncsa_dbt.accounting_feature_value' was properly closed.
20:02:24.404049 [info ] [MainThread]: 
20:02:24.404771 [info ] [MainThread]: Finished running 1 hook in 0 hours 0 minutes and 40.08 seconds (40.08s).
20:02:24.405777 [error] [MainThread]: Encountered an error:
{'S': 'ERROR', 'C': '42P05', 'M': 'prepared statement "redshift_connector_statement_21561_2" already exists', 'F': '../src/pg/src/backend/commands/commands_prepare.c', 'L': '645', 'R': 'StorePreparedStatement'}
20:02:24.411528 [error] [MainThread]: Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 359, in execute_nodes
    self.run_queue(pool)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 291, in run_queue
    self.job_queue.join()
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/graph/queue.py", line 198, in join
    self.inner.join()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/queue.py", line 90, in join
    self.all_tasks_done.wait()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/threading.py", line 320, in wait
    waiter.acquire()
KeyboardInterrupt

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 1793, in execute
    ps = cache["ps"][key]
         ~~~~~~~~~~~^^^^^
KeyError: ('select pg_terminate_backend(1073873737)', ())

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 91, in wrapper
    result, success = func(*args, **kwargs)
                      ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 76, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 169, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 198, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 245, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 278, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/main.py", line 761, in seed
    results = task.run()
              ^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 474, in run
    result = self.execute_with_hooks(selected_uids)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 435, in execute_with_hooks
    res = self.execute_nodes()
          ^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 384, in execute_nodes
    self._cancel_connections(pool)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 332, in _cancel_connections
    for conn_name in adapter.cancel_open_connections():
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/base/impl.py", line 1250, in cancel_open_connections
    return self.connections.cancel_open()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/sql/connections.py", line 43, in cancel_open
    self.cancel(connection)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/redshift/connections.py", line 260, in cancel
    cursor.execute(sql)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/cursor.py", line 248, in execute
    raise e
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/cursor.py", line 241, in execute
    self._c.execute(self, operation, args)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 1874, in execute
    self.handle_messages(cursor)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 2166, in handle_messages
    raise self.error
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/redshift/connections.py", line 273, in exception_handler
    yield
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/adapters/sql/connections.py", line 80, in add_query
    cursor.execute(sql, bindings)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/cursor.py", line 248, in execute
    raise e
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/cursor.py", line 241, in execute
    self._c.execute(self, operation, args)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 1959, in execute
    self.handle_messages(cursor)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/redshift_connector/core.py", line 2166, in handle_messages
    raise self.error
redshift_connector.error.ProgrammingError: {'S': 'ERROR', 'C': '42P05', 'M': 'prepared statement "redshift_connector_statement_21561_2" already exists', 'F': '../src/pg/src/backend/commands/commands_prepare.c', 'L': '645', 'R': 'StorePreparedStatement'}

�[0m20:02:24.417688 [debug] [MainThread]: Resource report: {"command_name": "seed", "command_wall_clock_time": 41.613743, "process_user_time": 1.723778, "process_kernel_time": 0.269604, "process_mem_max_rss": "174112768", "command_success": false, "process_in_blocks": "0", "process_out_blocks": "0"}
�[0m20:02:24.419152 [debug] [MainThread]: Command `dbt seed` failed at 20:02:24.418889 after 41.62 seconds
�[0m20:02:24.420566 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10a398dd0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x104cc3050>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x109ae2d10>]}
�[0m20:02:24.421466 [debug] [MainThread]: Flushing usage events

Logs after:


============================== 19:51:32.738464 | b7dd1bd3-9e57-4672-a52b-975ac4ef9d1d ==============================
�[0m19:51:32.738464 [info ] [MainThread]: Running with dbt=1.7.8
�[0m19:51:32.738855 [debug] [MainThread]: running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'write_json': 'True', 'log_cache_events': 'False', 'partial_parse': 'True', 'cache_selected_only': 'False', 'warn_error': 'None', 'fail_fast': 'False', 'profiles_dir': '.', 'log_path': 'logs', 'version_check': 'True', 'debug': 'False', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'log_format': 'default', 'introspect': 'True', 'invocation_command': 'dbt seed -s accounting_feature_value', 'static_parser': 'True', 'target_path': 'None', 'warn_error_options': "WarnErrorOptions(include=['NoNodesForSelectionCriteria'], exclude=[])", 'send_anonymous_usage_stats': 'True'}
�[0m19:51:32.876528 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'project_id', 'label': 'b7dd1bd3-9e57-4672-a52b-975ac4ef9d1d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x109652490>]}
�[0m19:51:32.916193 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': 'b7dd1bd3-9e57-4672-a52b-975ac4ef9d1d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x109233050>]}
�[0m19:51:32.916666 [info ] [MainThread]: Registered adapter: redshift=1.7.3
�[0m19:51:32.926025 [debug] [MainThread]: checksum: 67f0013ca5f0bd43af9a0873dd50792fde83ef69de63b71cacd0b4ac656c52e5, vars: {}, profile: , target: , version: 1.7.8
�[0m19:51:33.102542 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
�[0m19:51:33.102884 [debug] [MainThread]: Partial parsing enabled, no changes found, skipping parsing
�[0m19:51:33.161989 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': 'b7dd1bd3-9e57-4672-a52b-975ac4ef9d1d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10a484810>]}
�[0m19:51:33.315616 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': 'b7dd1bd3-9e57-4672-a52b-975ac4ef9d1d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10aed9fd0>]}
�[0m19:51:33.316051 [info ] [MainThread]: Found 439 models, 261 tests, 12 seeds, 4 operations, 333 sources, 0 exposures, 0 metrics, 623 macros, 0 groups, 0 semantic models
�[0m19:51:33.316571 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'b7dd1bd3-9e57-4672-a52b-975ac4ef9d1d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10b18a450>]}
�[0m19:51:33.322701 [info ] [MainThread]: 
�[0m19:51:33.323389 [debug] [MainThread]: Acquiring new redshift connection 'master'
�[0m19:51:33.324145 [debug] [ThreadPool]: Acquiring new redshift connection 'list_fasttrackprod'
�[0m19:51:33.332396 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod"
�[0m19:51:33.332717 [debug] [ThreadPool]: On list_fasttrackprod: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "connection_name": "list_fasttrackprod"} */

    select distinct nspname from pg_namespace
�[0m19:51:33.332960 [debug] [ThreadPool]: Opening a new connection, currently in state init
�[0m19:51:33.334765 [debug] [ThreadPool]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
�[0m19:51:33.335002 [debug] [ThreadPool]: Redshift adapter: Connecting to redshift with username/password based auth...
�[0m19:51:33.970634 [debug] [ThreadPool]: SQL status: SUCCESS in 1.0 seconds
�[0m19:51:33.972824 [debug] [ThreadPool]: On list_fasttrackprod: Close
�[0m19:51:33.984204 [debug] [ThreadPool]: Re-using an available connection from the pool (formerly list_fasttrackprod, now list_fasttrackprod_dbt_hevans)
�[0m19:51:33.988564 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod_dbt_hevans"
�[0m19:51:33.988867 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: BEGIN
�[0m19:51:33.989087 [debug] [ThreadPool]: Opening a new connection, currently in state closed
�[0m19:51:33.989390 [debug] [ThreadPool]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
�[0m19:51:33.989618 [debug] [ThreadPool]: Redshift adapter: Connecting to redshift with username/password based auth...
�[0m19:51:34.535806 [debug] [ThreadPool]: SQL status: SUCCESS in 1.0 seconds
�[0m19:51:34.536247 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod_dbt_hevans"
�[0m19:51:34.536551 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "connection_name": "list_fasttrackprod_dbt_hevans"} */
select
        table_catalog as database,
        table_name as name,
        table_schema as schema,
        'table' as type
    from information_schema.tables
    where table_schema ilike 'dbt_hevans'
    and table_type = 'BASE TABLE'
    union all
    select
      table_catalog as database,
      table_name as name,
      table_schema as schema,
      case
        when view_definition ilike '%create materialized view%'
          then 'materialized_view'
        else 'view'
      end as type
    from information_schema.views
    where table_schema ilike 'dbt_hevans'
�[0m19:51:34.672930 [debug] [ThreadPool]: SQL status: SUCCESS in 0.0 seconds
�[0m19:51:34.674182 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: ROLLBACK
�[0m19:51:34.782163 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: Close
�[0m19:51:34.796417 [debug] [MainThread]: Using redshift connection "master"
�[0m19:51:34.796719 [debug] [MainThread]: On master: BEGIN
�[0m19:51:34.796917 [debug] [MainThread]: Opening a new connection, currently in state init
�[0m19:51:34.797206 [debug] [MainThread]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
�[0m19:51:34.797412 [debug] [MainThread]: Redshift adapter: Connecting to redshift with username/password based auth...
�[0m19:51:35.333706 [debug] [MainThread]: SQL status: SUCCESS in 1.0 seconds
�[0m19:51:35.334232 [debug] [MainThread]: Using redshift connection "master"
�[0m19:51:35.334513 [debug] [MainThread]: On master: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "connection_name": "master"} */
with
    relation as (
        select
            pg_class.oid as relation_id,
            pg_class.relname as relation_name,
            pg_class.relnamespace as schema_id,
            pg_namespace.nspname as schema_name,
            pg_class.relkind as relation_type
        from pg_class
        join pg_namespace
          on pg_class.relnamespace = pg_namespace.oid
        where pg_namespace.nspname != 'information_schema'
          and pg_namespace.nspname not like 'pg\_%'
    ),
    dependency as (
        select distinct
            coalesce(pg_rewrite.ev_class, pg_depend.objid) as dep_relation_id,
            pg_depend.refobjid as ref_relation_id,
            pg_depend.refclassid as ref_class_id
        from pg_depend
        left join pg_rewrite
          on pg_depend.objid = pg_rewrite.oid
        where coalesce(pg_rewrite.ev_class, pg_depend.objid) != pg_depend.refobjid
    )

select distinct
    dep.schema_name as dependent_schema,
    dep.relation_name as dependent_name,
    ref.schema_name as referenced_schema,
    ref.relation_name as referenced_name
from dependency
join relation ref
    on dependency.ref_relation_id = ref.relation_id
join relation dep
    on dependency.dep_relation_id = dep.relation_id
�[0m19:51:35.579873 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
�[0m19:51:35.580916 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'b7dd1bd3-9e57-4672-a52b-975ac4ef9d1d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10af5f550>]}
�[0m19:51:35.581509 [debug] [MainThread]: On master: ROLLBACK
�[0m19:51:35.697352 [debug] [MainThread]: Using redshift connection "master"
�[0m19:51:35.697777 [debug] [MainThread]: On master: BEGIN
�[0m19:51:35.796805 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
�[0m19:51:35.797226 [debug] [MainThread]: On master: COMMIT
�[0m19:51:35.797533 [debug] [MainThread]: Using redshift connection "master"
�[0m19:51:35.797761 [debug] [MainThread]: On master: COMMIT
�[0m19:51:35.891393 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
�[0m19:51:35.891758 [info ] [MainThread]: 
�[0m19:51:35.892008 [info ] [MainThread]: Running 1 on-run-start hook
�[0m19:51:35.895901 [info ] [MainThread]: Skipping udf check and create. Target ("user") not in ['ci', 'docs']
�[0m19:51:35.896242 [debug] [MainThread]: Writing injected SQL for node "operation.ncsa_dbt.ncsa_dbt-on-run-start-0"
�[0m19:51:35.898260 [info ] [MainThread]: 1 of 1 START hook: ncsa_dbt.on-run-start.0 ..................................... [RUN]
�[0m19:51:35.898593 [info ] [MainThread]: 1 of 1 OK hook: ncsa_dbt.on-run-start.0 ........................................ [�[32mOK�[0m in 0.00s]
�[0m19:51:35.898818 [info ] [MainThread]: 
�[0m19:51:35.899077 [debug] [MainThread]: On master: Close
�[0m19:51:35.899580 [info ] [MainThread]: Concurrency: 4 threads (target='user')
�[0m19:51:35.899799 [info ] [MainThread]: 
�[0m19:51:35.901247 [debug] [Thread-1 (]: Began running node seed.ncsa_dbt.accounting_feature_value
�[0m19:51:35.901594 [info ] [Thread-1 (]: 1 of 1 START seed file dbt_hevans.accounting_feature_value ..................... [RUN]
�[0m19:51:35.902058 [debug] [Thread-1 (]: Re-using an available connection from the pool (formerly list_fasttrackprod_dbt_hevans, now seed.ncsa_dbt.accounting_feature_value)
�[0m19:51:35.902307 [debug] [Thread-1 (]: Began compiling node seed.ncsa_dbt.accounting_feature_value
�[0m19:51:35.902593 [debug] [Thread-1 (]: Timing info for seed.ncsa_dbt.accounting_feature_value (compile): 19:51:35.902466 => 19:51:35.902468
�[0m19:51:35.902828 [debug] [Thread-1 (]: Began executing node seed.ncsa_dbt.accounting_feature_value
�[0m19:51:35.918732 [debug] [Thread-1 (]: Using redshift connection "seed.ncsa_dbt.accounting_feature_value"
�[0m19:51:35.919106 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: BEGIN
�[0m19:51:35.919358 [debug] [Thread-1 (]: Opening a new connection, currently in state closed
�[0m19:51:35.921365 [debug] [Thread-1 (]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
�[0m19:51:35.921722 [debug] [Thread-1 (]: Redshift adapter: Connecting to redshift with username/password based auth...
�[0m19:51:36.450563 [debug] [Thread-1 (]: SQL status: SUCCESS in 1.0 seconds
�[0m19:51:36.451098 [debug] [Thread-1 (]: Using redshift connection "seed.ncsa_dbt.accounting_feature_value"
�[0m19:51:36.451463 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "node_id": "seed.ncsa_dbt.accounting_feature_value"} */
truncate table "fasttrackprod"."dbt_hevans"."accounting_feature_value"
�[0m19:51:42.394131 [debug] [MainThread]: Redshift adapter: Cancel query on: 'seed.ncsa_dbt.accounting_feature_value' with PID: 1073783138
�[0m19:51:42.395165 [debug] [MainThread]: Redshift adapter: select pg_terminate_backend(1073783138)
�[0m19:51:42.396065 [debug] [MainThread]: Using redshift connection "master"
�[0m19:51:42.396655 [debug] [MainThread]: On master: BEGIN
�[0m19:51:42.397199 [debug] [MainThread]: Opening a new connection, currently in state closed
�[0m19:51:42.397965 [debug] [MainThread]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
�[0m19:51:42.398465 [debug] [MainThread]: Redshift adapter: Connecting to redshift with username/password based auth...
�[0m19:51:42.943225 [debug] [MainThread]: SQL status: SUCCESS in 1.0 seconds
�[0m19:51:42.944458 [debug] [MainThread]: Using redshift connection "master"
�[0m19:51:42.945221 [debug] [MainThread]: On master: select pg_terminate_backend(1073783138)
�[0m19:51:43.036226 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
�[0m19:51:43.037141 [error] [MainThread]: CANCEL query seed.ncsa_dbt.accounting_feature_value ............................ [�[31mCANCEL�[0m]
�[0m19:51:43.037953 [debug] [MainThread]: On master: ROLLBACK
�[0m19:51:43.111523 [debug] [Thread-1 (]: Redshift adapter: Error running SQL: /* {"app": "dbt", "dbt_version": "1.7.8", "profile_name": "redshift", "target_name": "user", "node_id": "seed.ncsa_dbt.accounting_feature_value"} */
truncate table "fasttrackprod"."dbt_hevans"."accounting_feature_value"
�[0m19:51:43.112052 [debug] [Thread-1 (]: Redshift adapter: Rolling back transaction.
�[0m19:51:43.112397 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: ROLLBACK
�[0m19:51:43.114209 [debug] [Thread-1 (]: Failed to rollback 'seed.ncsa_dbt.accounting_feature_value'
�[0m19:51:43.114656 [debug] [Thread-1 (]: Redshift adapter: Error running SQL: macro truncate_relation
�[0m19:51:43.114952 [debug] [Thread-1 (]: Redshift adapter: Rolling back transaction.
�[0m19:51:43.115348 [debug] [Thread-1 (]: Timing info for seed.ncsa_dbt.accounting_feature_value (execute): 19:51:35.902970 => 19:51:43.115184
�[0m19:51:43.115654 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: Close
�[0m19:51:43.119505 [debug] [Thread-1 (]: Runtime Error in seed accounting_feature_value (seeds/accounting_feature_value.csv)
  BrokenPipe: server socket closed. Please check that client side networking configurations such as Proxies, firewalls, VPN, etc. are not affecting your network connection.
�[0m19:51:43.119935 [debug] [MainThread]: On master: Close
�[0m19:51:43.120326 [debug] [Thread-1 (]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': 'b7dd1bd3-9e57-4672-a52b-975ac4ef9d1d', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10b033fd0>]}
�[0m19:51:43.120947 [error] [Thread-1 (]: 1 of 1 ERROR loading seed file dbt_hevans.accounting_feature_value ............. [�[31mERROR�[0m in 7.22s]
�[0m19:51:43.121766 [debug] [Thread-1 (]: Finished running node seed.ncsa_dbt.accounting_feature_value
�[0m19:51:43.122158 [info ] [MainThread]: 
�[0m19:51:43.122454 [info ] [MainThread]: �[33mExited because of keyboard interrupt�[0m
�[0m19:51:43.122712 [info ] [MainThread]: 
�[0m19:51:43.122971 [info ] [MainThread]: Done. PASS=0 WARN=0 ERROR=0 SKIP=0 TOTAL=0
�[0m19:51:43.123219 [debug] [MainThread]: Connection 'master' was properly closed.
�[0m19:51:43.123448 [debug] [MainThread]: Connection 'seed.ncsa_dbt.accounting_feature_value' was properly closed.
�[0m19:51:43.123688 [info ] [MainThread]: 
�[0m19:51:43.123935 [info ] [MainThread]: Finished running 1 hook in 0 hours 0 minutes and 9.80 seconds (9.80s).
�[0m19:51:43.124266 [error] [MainThread]: Encountered an error:

�[0m19:51:43.125286 [error] [MainThread]: Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 91, in wrapper
    result, success = func(*args, **kwargs)
                      ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 76, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 169, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 198, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 245, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 278, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/main.py", line 761, in seed
    results = task.run()
              ^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 474, in run
    result = self.execute_with_hooks(selected_uids)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 435, in execute_with_hooks
    res = self.execute_nodes()
          ^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 359, in execute_nodes
    self.run_queue(pool)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 291, in run_queue
    self.job_queue.join()
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/graph/queue.py", line 198, in join
    self.inner.join()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/queue.py", line 90, in join
    self.all_tasks_done.wait()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/threading.py", line 320, in wait
    waiter.acquire()
KeyboardInterrupt

�[0m19:51:43.126499 [debug] [MainThread]: Resource report: {"command_name": "seed", "command_wall_clock_time": 10.417348, "process_user_time": 1.577609, "process_kernel_time": 0.234388, "process_mem_max_rss": "168263680", "command_success": false, "process_in_blocks": "0", "process_out_blocks": "0"}
�[0m19:51:43.126922 [debug] [MainThread]: Command `dbt seed` failed at 19:51:43.126820 after 10.42 seconds
�[0m19:51:43.127240 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x108f6b850>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x108ee4f10>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10875de10>]}
�[0m19:51:43.127534 [debug] [MainThread]: Flushing usage events

@cla-bot cla-bot bot added the cla:yes label Feb 26, 2024
@holly-evans
Contributor Author

Hi, is there anything I can add to help this PR move forward?

Tagging @mikealfare, I see you on most of the PRs this year

@mikealfare mikealfare self-assigned this Mar 26, 2024
@mikealfare
Contributor

> Hi, is there anything I can add to help this PR move forward?
>
> Tagging @mikealfare, I see you on most of the PRs this year

Thanks for your submission @holly-evans! This looks pretty good. I updated the branch and approved running integration tests. The one thing that jumps out is a missing changelog. Please refer to the steps here for adding a changelog entry. I'll review this sometime in the next week. Thanks again!

@mikealfare mikealfare requested review from mikealfare and a team and removed request for mikealfare March 26, 2024 17:08
@holly-evans
Contributor Author

@mikealfare added changelog entry 👍

def cancel(self, connection: Connection):
    try:
-       pid = self._get_backend_pid()
+       pid = connection.handle.backend_pid

@holly-evans This is all looking very good, but we do have some tests failing. I was looking at this a bit, and this may be a good place to start: it doesn't look like we set up a handle object here, so we may need to reference the new ConnectionWrapper more directly.

If I can be of any more help, please feel free to message me. I will try to spend a little more time with this and give more advice as I can.
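The wrapper approach under discussion can be sketched roughly as follows. This is an illustrative sketch, not the adapter's actual code (`_query_pid` and `terminate_sql` are made-up names); the key idea is capturing the backend pid immediately after connecting, while the session is still idle:

```python
class ConnectionWrapper:
    """Wrap a raw driver connection and remember its backend pid.

    The pid is captured right after connecting, while the session is
    idle, because asking a busy session for `pg_backend_pid()` would
    block behind the in-flight query.
    """

    def __init__(self, handle):
        self._handle = handle
        self.backend_pid = self._query_pid()

    def _query_pid(self):
        cursor = self._handle.cursor()
        cursor.execute("select pg_backend_pid()")
        return cursor.fetchone()[0]

    def __getattr__(self, name):
        # Delegate everything else (cursor, close, commit, ...) to the
        # wrapped connection so callers can treat this as the handle.
        return getattr(self._handle, name)


def terminate_sql(pid):
    # Statement the master connection issues to cancel another session.
    return f"select pg_terminate_backend({pid})"
```

On cancel, the adapter can then run `terminate_sql(wrapped.backend_pid)` on the master connection even while the wrapped connection is stuck on a query.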

@martynydbt
Contributor

@McKnight-42 Thank you for that feedback!

@holly-evans also wanted to apologize for the delay with this PR. We have pulled this into our current sprint, are watching for your engagement, and are aiming to get this reviewed and approved as soon as possible. Thank you for your patience. 🙏

@colin-rogers-dbt if this is ready for final approval while I'm out next week, can you please do the honors of getting it across the line? Thanks!

@martynydbt martynydbt assigned McKnight-42 and unassigned mikealfare Apr 6, 2024
@Fleid Fleid assigned mikealfare and unassigned McKnight-42 Apr 10, 2024
@mikealfare
Contributor

Many of the failures are known flaky tests (already captured in another ticket). I'm rerunning CI to see if it changes what's passing. There were only maybe one or two failed tests that looked like they could be related to this.


@mikealfare mikealfare left a comment


I left several comments, mostly around trying a different approach. As I mention below, I would have rather used your wrapper approach, but I could not figure out how to get it to work with some existing scenarios. While the approach I recommend appears to keep the existing tests passing, I'm eager to find out if you see a difference in the execution, and subsequently the log outputs that you previously shared. Please let me know how this works out, and if you see any issues with these recommendations. Thanks again for your contribution!

Review threads (resolved):
dbt/adapters/redshift/connections.py: 6 threads (5 outdated)
tests/unit/test_redshift_adapter.py: 3 threads
@holly-evans
Contributor Author

holly-evans commented Apr 12, 2024

Made the suggested changes and the unit tests are passing. Hoping the functional tests pass!

It works in my project too. I locked dbt_hevans.accounting_feature_value before running the dbt command to simulate my issue, then cancelled the run with Ctrl+C.
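For anyone reproducing this, the blocking lock can be held from a separate SQL session before starting the seed (the table name is from this project; any table the seed truncates works):

```sql
-- in a separate session: take an exclusive lock and keep the transaction open
begin;
lock table dbt_hevans.accounting_feature_value;
-- now run `dbt seed -s accounting_feature_value` elsewhere and press Ctrl+C;
-- the seed's TRUNCATE blocks on this lock until the backend is terminated
```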

�[0m11:32:27.969906 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10a08b490>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x109688410>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10a067bd0>]}


============================== 11:32:27.972256 | 0a19b297-cebb-4506-a78c-0f6e7be080ca ==============================
�[0m11:32:27.972256 [info ] [MainThread]: Running with dbt=1.8.0-b2
�[0m11:32:27.972599 [debug] [MainThread]: running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'write_json': 'True', 'log_cache_events': 'False', 'partial_parse': 'True', 'cache_selected_only': 'False', 'warn_error': 'None', 'version_check': 'True', 'profiles_dir': '.', 'log_path': 'logs', 'fail_fast': 'False', 'debug': 'False', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'warn_error_options': "WarnErrorOptions(include=['NoNodesForSelectionCriteria'], exclude=[])", 'introspect': 'True', 'static_parser': 'True', 'log_format': 'default', 'target_path': 'None', 'invocation_command': 'dbt seed -s accounting_feature_value', 'send_anonymous_usage_stats': 'True'}
�[0m11:32:28.067676 [debug] [MainThread]: Redshift adapter: Setting redshift_connector to ERROR
�[0m11:32:28.067993 [debug] [MainThread]: Redshift adapter: Setting redshift_connector.core to ERROR
�[0m11:32:28.153388 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'project_id', 'label': '0a19b297-cebb-4506-a78c-0f6e7be080ca', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10b1e77d0>]}
�[0m11:32:28.181364 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': '0a19b297-cebb-4506-a78c-0f6e7be080ca', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x107f2e9d0>]}
�[0m11:32:28.182032 [info ] [MainThread]: Registered adapter: redshift=1.8.0-b1
�[0m11:32:28.197144 [debug] [MainThread]: checksum: 58709f380fc6436d4f850b67ab92be277542c64497f047992303b3c1836fe3f2, vars: {}, profile: , target: user, version: 1.8.0b2
�[0m11:32:28.457334 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
�[0m11:32:28.457690 [debug] [MainThread]: Partial parsing enabled, no changes found, skipping parsing
�[0m11:32:28.551063 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': '0a19b297-cebb-4506-a78c-0f6e7be080ca', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10ace5650>]}
�[0m11:32:28.920594 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': '0a19b297-cebb-4506-a78c-0f6e7be080ca', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10a0f6310>]}
�[0m11:32:28.921041 [info ] [MainThread]: Found 493 models, 339 data tests, 12 seeds, 4 operations, 364 sources, 650 macros
�[0m11:32:28.921441 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '0a19b297-cebb-4506-a78c-0f6e7be080ca', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10a0d57d0>]}
�[0m11:32:28.928516 [info ] [MainThread]: 
�[0m11:32:28.928976 [debug] [MainThread]: Acquiring new redshift connection 'master'
�[0m11:32:28.929696 [debug] [ThreadPool]: Acquiring new redshift connection 'list_fasttrackprod'
�[0m11:32:28.938735 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod"
�[0m11:32:28.939010 [debug] [ThreadPool]: On list_fasttrackprod: /* {"app": "dbt", "dbt_version": "1.8.0b2", "profile_name": "redshift", "target_name": "user", "connection_name": "list_fasttrackprod"} */

    select distinct nspname from pg_namespace
11:32:28.939188 [debug] [ThreadPool]: Opening a new connection, currently in state init
11:32:28.941172 [debug] [ThreadPool]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
11:32:28.941643 [debug] [ThreadPool]: Redshift adapter: Connecting to redshift with username/password based auth...
11:32:29.907128 [debug] [ThreadPool]: SQL status: SUCCESS in 1.0 seconds
11:32:29.913615 [debug] [ThreadPool]: On list_fasttrackprod: Close
11:32:29.942901 [debug] [ThreadPool]: Re-using an available connection from the pool (formerly list_fasttrackprod, now list_fasttrackprod_dbt_hevans)
11:32:29.948124 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod_dbt_hevans"
11:32:29.948423 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: BEGIN
11:32:29.948661 [debug] [ThreadPool]: Opening a new connection, currently in state closed
11:32:29.948966 [debug] [ThreadPool]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
11:32:29.949191 [debug] [ThreadPool]: Redshift adapter: Connecting to redshift with username/password based auth...
11:32:30.937452 [debug] [ThreadPool]: SQL status: SUCCESS in 1.0 seconds
11:32:30.939736 [debug] [ThreadPool]: Using redshift connection "list_fasttrackprod_dbt_hevans"
11:32:30.940850 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: /* {"app": "dbt", "dbt_version": "1.8.0b2", "profile_name": "redshift", "target_name": "user", "connection_name": "list_fasttrackprod_dbt_hevans"} */
select
        table_catalog as database,
        table_name as name,
        table_schema as schema,
        'table' as type
    from information_schema.tables
    where table_schema ilike 'dbt_hevans'
    and table_type = 'BASE TABLE'
    union all
    select
      table_catalog as database,
      table_name as name,
      table_schema as schema,
      case
        when view_definition ilike '%create materialized view%'
          then 'materialized_view'
        else 'view'
      end as type
    from information_schema.views
    where table_schema ilike 'dbt_hevans'
11:32:31.127147 [debug] [ThreadPool]: SQL status: SUCCESS in 0.0 seconds
11:32:31.132095 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: ROLLBACK
11:32:31.471669 [debug] [ThreadPool]: On list_fasttrackprod_dbt_hevans: Close
11:32:31.508541 [debug] [MainThread]: Using redshift connection "master"
11:32:31.508930 [debug] [MainThread]: On master: BEGIN
11:32:31.509166 [debug] [MainThread]: Opening a new connection, currently in state init
11:32:31.509489 [debug] [MainThread]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
11:32:31.509732 [debug] [MainThread]: Redshift adapter: Connecting to redshift with username/password based auth...
11:32:32.171877 [debug] [MainThread]: SQL status: SUCCESS in 1.0 seconds
11:32:32.174045 [debug] [MainThread]: Using redshift connection "master"
11:32:32.175341 [debug] [MainThread]: On master: /* {"app": "dbt", "dbt_version": "1.8.0b2", "profile_name": "redshift", "target_name": "user", "connection_name": "master"} */
with
    relation as (
        select
            pg_class.oid as relation_id,
            pg_class.relname as relation_name,
            pg_class.relnamespace as schema_id,
            pg_namespace.nspname as schema_name,
            pg_class.relkind as relation_type
        from pg_class
        join pg_namespace
          on pg_class.relnamespace = pg_namespace.oid
        where pg_namespace.nspname != 'information_schema'
          and pg_namespace.nspname not like 'pg\_%'
    ),
    dependency as (
        select distinct
            coalesce(pg_rewrite.ev_class, pg_depend.objid) as dep_relation_id,
            pg_depend.refobjid as ref_relation_id,
            pg_depend.refclassid as ref_class_id
        from pg_depend
        left join pg_rewrite
          on pg_depend.objid = pg_rewrite.oid
        where coalesce(pg_rewrite.ev_class, pg_depend.objid) != pg_depend.refobjid
    )

select distinct
    dep.schema_name as dependent_schema,
    dep.relation_name as dependent_name,
    ref.schema_name as referenced_schema,
    ref.relation_name as referenced_name
from dependency
join relation ref
    on dependency.ref_relation_id = ref.relation_id
join relation dep
    on dependency.dep_relation_id = dep.relation_id
11:32:32.442730 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
11:32:32.443797 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '0a19b297-cebb-4506-a78c-0f6e7be080ca', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10fb6b710>]}
11:32:32.444267 [debug] [MainThread]: On master: ROLLBACK
11:32:32.580750 [debug] [MainThread]: Using redshift connection "master"
11:32:32.581100 [debug] [MainThread]: On master: BEGIN
11:32:32.734472 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
11:32:32.735285 [debug] [MainThread]: On master: COMMIT
11:32:32.735872 [debug] [MainThread]: Using redshift connection "master"
11:32:32.736293 [debug] [MainThread]: On master: COMMIT
11:32:32.966629 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
11:32:32.967059 [info ] [MainThread]: 
11:32:32.967327 [info ] [MainThread]: Running 1 on-run-start hook
11:32:32.971685 [info ] [MainThread]: Skipping udf check and create. Target ("user") not in ['ci', 'docs']
11:32:32.972111 [debug] [MainThread]: Writing injected SQL for node "operation.ncsa_dbt.ncsa_dbt-on-run-start-0"
11:32:32.974725 [info ] [MainThread]: 1 of 1 START hook: ncsa_dbt.on-run-start.0 ..................................... [RUN]
11:32:32.975131 [info ] [MainThread]: 1 of 1 OK hook: ncsa_dbt.on-run-start.0 ........................................ [OK in 0.00s]
11:32:32.975387 [info ] [MainThread]: 
11:32:32.975683 [debug] [MainThread]: On master: Close
11:32:32.976166 [info ] [MainThread]: Concurrency: 4 threads (target='user')
11:32:32.976493 [info ] [MainThread]: 
11:32:32.978600 [debug] [Thread-1 (]: Began running node seed.ncsa_dbt.accounting_feature_value
11:32:32.979011 [info ] [Thread-1 (]: 1 of 1 START seed file dbt_hevans.accounting_feature_value ..................... [RUN]
11:32:32.979305 [debug] [Thread-1 (]: Re-using an available connection from the pool (formerly list_fasttrackprod_dbt_hevans, now seed.ncsa_dbt.accounting_feature_value)
11:32:32.979508 [debug] [Thread-1 (]: Began compiling node seed.ncsa_dbt.accounting_feature_value
11:32:32.979734 [debug] [Thread-1 (]: Timing info for seed.ncsa_dbt.accounting_feature_value (compile): 11:32:32.979648 => 11:32:32.979650
11:32:32.979912 [debug] [Thread-1 (]: Began executing node seed.ncsa_dbt.accounting_feature_value
11:32:32.999110 [debug] [Thread-1 (]: Using redshift connection "seed.ncsa_dbt.accounting_feature_value"
11:32:32.999477 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: BEGIN
11:32:32.999700 [debug] [Thread-1 (]: Opening a new connection, currently in state closed
11:32:33.000027 [debug] [Thread-1 (]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
11:32:33.000262 [debug] [Thread-1 (]: Redshift adapter: Connecting to redshift with username/password based auth...
11:32:33.896115 [debug] [Thread-1 (]: SQL status: SUCCESS in 1.0 seconds
11:32:33.898157 [debug] [Thread-1 (]: Using redshift connection "seed.ncsa_dbt.accounting_feature_value"
11:32:33.899035 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: /* {"app": "dbt", "dbt_version": "1.8.0b2", "profile_name": "redshift", "target_name": "user", "node_id": "seed.ncsa_dbt.accounting_feature_value"} */
truncate table "fasttrackprod"."dbt_hevans"."accounting_feature_value"
11:32:37.334377 [debug] [MainThread]: Redshift adapter: Cancel query on: 'seed.ncsa_dbt.accounting_feature_value' with PID: 1073850719
11:32:37.334916 [debug] [MainThread]: Redshift adapter: select pg_terminate_backend(1073850719)
11:32:37.335436 [debug] [MainThread]: Using redshift connection "master"
11:32:37.335791 [debug] [MainThread]: On master: BEGIN
11:32:37.336223 [debug] [MainThread]: Opening a new connection, currently in state closed
11:32:37.336714 [debug] [MainThread]: Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'.To connect without ssl, set `sslmode` to 'disable'.
11:32:37.337035 [debug] [MainThread]: Redshift adapter: Connecting to redshift with username/password based auth...
11:32:38.579673 [debug] [MainThread]: SQL status: SUCCESS in 1.0 seconds
11:32:38.580052 [debug] [MainThread]: Using redshift connection "master"
11:32:38.580270 [debug] [MainThread]: On master: select pg_terminate_backend(1073850719)
11:32:38.679235 [debug] [MainThread]: SQL status: SUCCESS in 0.0 seconds
11:32:38.679643 [error] [MainThread]: CANCEL query seed.ncsa_dbt.accounting_feature_value ............................ [CANCEL]
11:32:38.680060 [debug] [MainThread]: On master: ROLLBACK
11:32:38.733236 [debug] [Thread-1 (]: Redshift adapter: Error running SQL: /* {"app": "dbt", "dbt_version": "1.8.0b2", "profile_name": "redshift", "target_name": "user", "node_id": "seed.ncsa_dbt.accounting_feature_value"} */
truncate table "fasttrackprod"."dbt_hevans"."accounting_feature_value"
11:32:38.733955 [debug] [Thread-1 (]: Redshift adapter: Rolling back transaction.
11:32:38.734257 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: ROLLBACK
11:32:38.735995 [debug] [Thread-1 (]: Failed to rollback 'seed.ncsa_dbt.accounting_feature_value'
11:32:38.736412 [debug] [Thread-1 (]: Redshift adapter: Error running SQL: macro truncate_relation
11:32:38.736658 [debug] [Thread-1 (]: Redshift adapter: Rolling back transaction.
11:32:38.736989 [debug] [Thread-1 (]: Timing info for seed.ncsa_dbt.accounting_feature_value (execute): 11:32:32.980046 => 11:32:38.736886
11:32:38.737212 [debug] [Thread-1 (]: On seed.ncsa_dbt.accounting_feature_value: Close
11:32:38.740408 [debug] [Thread-1 (]: Runtime Error in seed accounting_feature_value (seeds/accounting_feature_value.csv)
  BrokenPipe: server socket closed. Please check that client side networking configurations such as Proxies, firewalls, VPN, etc. are not affecting your network connection.
11:32:38.741300 [debug] [Thread-1 (]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '0a19b297-cebb-4506-a78c-0f6e7be080ca', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10b6de110>]}
11:32:38.741640 [error] [Thread-1 (]: 1 of 1 ERROR loading seed file dbt_hevans.accounting_feature_value ............. [ERROR in 5.76s]
11:32:38.741971 [debug] [Thread-1 (]: Finished running node seed.ncsa_dbt.accounting_feature_value
11:32:38.810406 [debug] [MainThread]: On master: Close
11:32:38.811058 [info ] [MainThread]: 
11:32:38.811424 [info ] [MainThread]: Exited because of keyboard interrupt
11:32:38.811681 [info ] [MainThread]: 
11:32:38.811960 [info ] [MainThread]: Done. PASS=0 WARN=0 ERROR=0 SKIP=0 TOTAL=0
11:32:38.812217 [debug] [MainThread]: Connection 'master' was properly closed.
11:32:38.812437 [debug] [MainThread]: Connection 'seed.ncsa_dbt.accounting_feature_value' was properly closed.
11:32:38.812677 [info ] [MainThread]: 
11:32:38.812929 [info ] [MainThread]: Finished running 1 project hook in 0 hours 0 minutes and 9.88 seconds (9.88s).
11:32:38.813280 [error] [MainThread]: Encountered an error:

11:32:38.814369 [error] [MainThread]: Traceback (most recent call last):
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 106, in wrapper
    result, success = func(*args, **kwargs)
                      ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 91, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 184, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 213, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 260, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/requires.py", line 298, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/cli/main.py", line 700, in seed
    results = task.run()
              ^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 537, in run
    result = self.execute_with_hooks(selected_uids)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 498, in execute_with_hooks
    res = self.execute_nodes()
          ^^^^^^^^^^^^^^^^^^^^
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 408, in execute_nodes
    self.run_queue(pool)
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/task/runnable.py", line 326, in run_queue
    self.job_queue.join()
  File "/Users/hevans/.local/share/virtualenvs/data_dbt-AeL2xOvX/lib/python3.11/site-packages/dbt/graph/queue.py", line 198, in join
    self.inner.join()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/queue.py", line 90, in join
    self.all_tasks_done.wait()
  File "/Users/hevans/.pyenv/versions/3.11.4/lib/python3.11/threading.py", line 320, in wait
    waiter.acquire()
KeyboardInterrupt

11:32:38.815624 [debug] [MainThread]: Resource report: {"command_name": "seed", "command_wall_clock_time": 10.874433, "process_user_time": 1.575251, "process_kernel_time": 0.253991, "process_mem_max_rss": "165593088", "command_success": false, "process_in_blocks": "0", "process_out_blocks": "0"}
11:32:38.816025 [debug] [MainThread]: Command `dbt seed` failed at 11:32:38.815967 after 10.87 seconds
11:32:38.816312 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x104cdc890>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x104cdc850>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x10b6b4d10>]}
11:32:38.816587 [debug] [MainThread]: Flushing usage events

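The cancel sequence visible in the log above (the adapter already knows PID 1073850719 when the interrupt arrives, then terminates that session from a separate master connection) relies on capturing the backend PID at the moment the handle is opened, not at cancel time. The sketch below illustrates that wrapper idea only; the `Fake*` classes are stand-ins for a live driver, not `redshift_connector`'s actual API, and the class and method names are hypothetical.

```python
class FakeCursor:
    """Stand-in cursor that records executed SQL (not a real driver)."""

    def __init__(self, backend_pid):
        self._backend_pid = backend_pid
        self.executed = []

    def execute(self, sql):
        self.executed.append(sql)
        return self

    def fetchone(self):
        # Pretend the last statement was `select pg_backend_pid()`.
        return (self._backend_pid,)


class FakeConnection:
    """Stand-in connection whose session has a fixed backend PID."""

    def __init__(self, backend_pid):
        self._backend_pid = backend_pid

    def cursor(self):
        return FakeCursor(self._backend_pid)


class ConnectionWrapper:
    """Wraps a handle and captures its backend PID at open time, so the PID
    is available later even if the handle is blocked on a running query."""

    def __init__(self, handle):
        self.handle = handle
        cur = handle.cursor()
        cur.execute("select pg_backend_pid()")
        self.backend_pid = cur.fetchone()[0]

    def cancel_via(self, master_connection):
        # Terminate this wrapper's session from a *different* connection,
        # mirroring the `select pg_terminate_backend(...)` call on master.
        sql = f"select pg_terminate_backend({self.backend_pid})"
        master_connection.cursor().execute(sql)
        return sql


# Usage: open a worker connection, then cancel it from the master connection.
worker = ConnectionWrapper(FakeConnection(backend_pid=1073850719))
master = FakeConnection(backend_pid=42)
issued = worker.cancel_via(master)
print(issued)  # select pg_terminate_backend(1073850719)
```

The key design point matches the PR description: querying `select pg_backend_pid()` eagerly at connection time avoids the deadlock of asking a busy connection for its own PID while it is mid-query.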
@holly-evans holly-evans requested a review from mikealfare April 12, 2024 19:54
Review comments on dbt/adapters/redshift/connections.py (outdated, resolved)

@mikealfare mikealfare left a comment

Thanks for your help and effort @holly-evans!

@mikealfare mikealfare merged commit fdad756 into dbt-labs:main Apr 13, 2024
24 checks passed
mikealfare pushed a commit that referenced this pull request Apr 15, 2024
mikealfare added a commit that referenced this pull request Apr 17, 2024
Successfully merging this pull request may close these issues.

[ADAP-1099] [Bug] Interrupt (CTRL+C) is not cancelling the right query (wrong pid)