Commit
Google provider: Remove `bigquery_conn_id`, `google_cloud_storage_conn_id` (#23326)

* `bigquery_conn_id` is removed. Please use `gcp_conn_id`.
  affected classes:
  `BigQueryCheckOperator`
  `BigQueryCreateEmptyDatasetOperator`
  `BigQueryDeleteDatasetOperator`
  `BigQueryDeleteTableOperator`
  `BigQueryExecuteQueryOperator`
  `BigQueryGetDataOperator`
  `BigQueryHook`
  `BigQueryIntervalCheckOperator`
  `BigQueryTableExistenceSensor`
  `BigQueryTablePartitionExistenceSensor`
  `BigQueryToBigQueryOperator`
  `BigQueryToGCSOperator`
  `BigQueryUpdateTableSchemaOperator`
  `BigQueryUpsertTableOperator`
  `BigQueryValueCheckOperator`
  `GCSToBigQueryOperator`

* `google_cloud_storage_conn_id` is removed. Please use `gcp_conn_id`.
  affected classes:
  `ADLSToGCSOperator`
  `BaseSQLToGCSOperator`
  `CassandraToGCSOperator`
  `GCSBucketCreateAclEntryOperator`
  `GCSCreateBucketOperator`
  `GCSDeleteObjectsOperator`
  `GCSHook`
  `GCSListObjectsOperator`
  `GCSObjectCreateAclEntryOperator`
  `GCSToBigQueryOperator`
  `GCSToGCSOperator`
  `GCSToLocalFilesystemOperator`
  `LocalFilesystemToGCSOperator`
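For users upgrading, both removals are a pure parameter rename: every class above now takes `gcp_conn_id` only. A minimal sketch of the rename as a plain-Python helper — hypothetical code, not part of Airflow — that maps the removed keyword arguments onto `gcp_conn_id`:

```python
# Hypothetical helper (not part of Airflow) illustrating the rename this
# commit requires: both removed connection-id parameters collapse into
# gcp_conn_id.
DEPRECATED_CONN_PARAMS = ("bigquery_conn_id", "google_cloud_storage_conn_id")


def migrate_conn_kwargs(kwargs: dict) -> dict:
    """Return a copy of operator kwargs with legacy conn-id params renamed."""
    migrated = dict(kwargs)
    for old in DEPRECATED_CONN_PARAMS:
        if old in migrated:
            value = migrated.pop(old)
            # An explicitly passed gcp_conn_id wins; otherwise the legacy
            # value moves over under the new name.
            migrated.setdefault("gcp_conn_id", value)
    return migrated
```

For example, `migrate_conn_kwargs({"bigquery_conn_id": "my_bq"})` yields `{"gcp_conn_id": "my_bq"}`.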
eladkal committed Apr 29, 2022
1 parent dd7002d commit 359dc58
Showing 19 changed files with 42 additions and 379 deletions.
35 changes: 35 additions & 0 deletions airflow/providers/google/CHANGELOG.rst
@@ -89,6 +89,41 @@ Breaking changes

 * ``CloudDatastoreExportEntitiesOperator`` : Remove ``xcom_push``. Please use ``BaseOperator.do_xcom_push``
 
+* ``bigquery_conn_id`` is removed. Please use ``gcp_conn_id``.
+  affected classes:
+  ``BigQueryCheckOperator``
+  ``BigQueryCreateEmptyDatasetOperator``
+  ``BigQueryDeleteDatasetOperator``
+  ``BigQueryDeleteTableOperator``
+  ``BigQueryExecuteQueryOperator``
+  ``BigQueryGetDataOperator``
+  ``BigQueryHook``
+  ``BigQueryIntervalCheckOperator``
+  ``BigQueryTableExistenceSensor``
+  ``BigQueryTablePartitionExistenceSensor``
+  ``BigQueryToBigQueryOperator``
+  ``BigQueryToGCSOperator``
+  ``BigQueryUpdateTableSchemaOperator``
+  ``BigQueryUpsertTableOperator``
+  ``BigQueryValueCheckOperator``
+  ``GCSToBigQueryOperator``
+
+* ``google_cloud_storage_conn_id`` is removed. Please use ``gcp_conn_id``.
+  affected classes:
+  ``ADLSToGCSOperator``
+  ``BaseSQLToGCSOperator``
+  ``CassandraToGCSOperator``
+  ``GCSBucketCreateAclEntryOperator``
+  ``GCSCreateBucketOperator``
+  ``GCSDeleteObjectsOperator``
+  ``GCSHook``
+  ``GCSListObjectsOperator``
+  ``GCSObjectCreateAclEntryOperator``
+  ``GCSToBigQueryOperator``
+  ``GCSToGCSOperator``
+  ``GCSToLocalFilesystemOperator``
+  ``LocalFilesystemToGCSOperator``
+
 * ``BigQueryHook.create_empty_table`` Remove ``num_retries``. Please use ``retry``.
 
 * ``BigQueryHook.run_grant_dataset_view_access`` Remove ``source_project``. Please use ``project_id``.
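The practical effect of each removal above is the same: the constructors no longer accept the old keyword at all, so it falls through to Python's ordinary unexpected-keyword handling. A minimal stand-in — a hypothetical class, not the real hook — showing what callers see after upgrading:

```python
class StrictHook:
    """Stand-in for a post-removal hook: only gcp_conn_id is accepted."""

    def __init__(self, gcp_conn_id: str = "google_cloud_default") -> None:
        self.gcp_conn_id = gcp_conn_id


# Passing the removed keyword now raises TypeError instead of emitting a
# DeprecationWarning; the message names the rejected keyword.
try:
    StrictHook(bigquery_conn_id="my_conn")
except TypeError as exc:
    print(exc)
```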
12 changes: 0 additions & 12 deletions airflow/providers/google/cloud/hooks/bigquery.py
@@ -69,7 +69,6 @@ class BigQueryHook(GoogleBaseHook, DbApiHook):
     :param delegate_to: This performs a task on one host with reference to other hosts.
     :param use_legacy_sql: This specifies whether to use legacy SQL dialect.
     :param location: The location of the BigQuery resource.
-    :param bigquery_conn_id: The Airflow connection used for BigQuery credentials.
     :param api_resource_configs: This contains params configuration applied for Google BigQuery jobs.
     :param impersonation_chain: This is the optional service account to impersonate using short term
         credentials.
@@ -87,21 +86,10 @@ def __init__(
         delegate_to: Optional[str] = None,
         use_legacy_sql: bool = True,
         location: Optional[str] = None,
-        bigquery_conn_id: Optional[str] = None,
         api_resource_configs: Optional[Dict] = None,
         impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
         labels: Optional[Dict] = None,
     ) -> None:
-        # To preserve backward compatibility
-        # TODO: remove one day
-        if bigquery_conn_id:
-            warnings.warn(
-                "The bigquery_conn_id parameter has been deprecated. You should pass "
-                "the gcp_conn_id parameter.",
-                DeprecationWarning,
-                stacklevel=2,
-            )
-            gcp_conn_id = bigquery_conn_id
         super().__init__(
             gcp_conn_id=gcp_conn_id,
             delegate_to=delegate_to,
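The deleted block followed a common deprecation-shim pattern: accept the legacy keyword, warn, and forward its value to the new parameter. A self-contained sketch of that pattern, using a hypothetical class outside Airflow:

```python
import warnings
from typing import Optional


class LegacyAwareHook:
    """Illustrates the shim this commit deletes: a deprecated alias
    parameter that warns and forwards to the new name."""

    def __init__(
        self,
        gcp_conn_id: str = "google_cloud_default",
        bigquery_conn_id: Optional[str] = None,
    ) -> None:
        if bigquery_conn_id:
            warnings.warn(
                "The bigquery_conn_id parameter has been deprecated. "
                "You should pass the gcp_conn_id parameter.",
                DeprecationWarning,
                stacklevel=2,  # point the warning at the caller's line
            )
            gcp_conn_id = bigquery_conn_id
        self.gcp_conn_id = gcp_conn_id
```

Under this shim, `LegacyAwareHook(bigquery_conn_id="legacy")` still works but emits a `DeprecationWarning`; the commit ends that grace period by deleting the alias outright.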
12 changes: 0 additions & 12 deletions airflow/providers/google/cloud/hooks/gcs.py
@@ -22,7 +22,6 @@
 import os
 import shutil
 import time
-import warnings
 from contextlib import contextmanager
 from datetime import datetime
 from functools import partial
@@ -133,19 +132,8 @@ def __init__(
         self,
         gcp_conn_id: str = "google_cloud_default",
         delegate_to: Optional[str] = None,
-        google_cloud_storage_conn_id: Optional[str] = None,
         impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
     ) -> None:
-        # To preserve backward compatibility
-        # TODO: remove one day
-        if google_cloud_storage_conn_id:
-            warnings.warn(
-                "The google_cloud_storage_conn_id parameter has been deprecated. You should pass "
-                "the gcp_conn_id parameter.",
-                DeprecationWarning,
-                stacklevel=2,
-            )
-            gcp_conn_id = google_cloud_storage_conn_id
         super().__init__(
             gcp_conn_id=gcp_conn_id,
             delegate_to=delegate_to,
