Update Google Cloud branding (#10642)
mik-laj committed Aug 29, 2020
1 parent 934115b commit 2ca615c
Showing 92 changed files with 515 additions and 523 deletions.
4 changes: 2 additions & 2 deletions BREEZE.rst
@@ -433,15 +433,15 @@ Those are currently installed CLIs (they are available as aliases to the docker
+-----------------------+----------+-------------------------------------------------+-------------------+
| Microsoft Azure       | az       | mcr.microsoft.com/azure-cli:latest              | .azure            |
+-----------------------+----------+-------------------------------------------------+-------------------+
- | Google Cloud Platform | bq       | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
+ | Google Cloud          | bq       | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
|                       +----------+-------------------------------------------------+-------------------+
|                       | gcloud   | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
|                       +----------+-------------------------------------------------+-------------------+
|                       | gsutil   | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
+-----------------------+----------+-------------------------------------------------+-------------------+

For each of the CLIs we have also an accompanying ``*-update`` alias (for example ``aws-update``) which
- will pull the latest image for the tool. Note that all Google Cloud Platform tools are served by one
+ will pull the latest image for the tool. Note that all Google Cloud tools are served by one
image and they are updated together.

Also - in case you run several different Breeze containers in parallel (from different directories,
2 changes: 1 addition & 1 deletion CONTRIBUTING.rst
@@ -817,7 +817,7 @@ We support the following types of tests:
additional services running, such as Postgres, Mysql, Kerberos, etc.

* **System tests** are automatic tests that use external systems like
-   Google Cloud Platform. These tests are intended for an end-to-end DAG execution.
+   Google Cloud. These tests are intended for an end-to-end DAG execution.

For details on running different types of Airflow tests, see `TESTING.rst <TESTING.rst>`_.

6 changes: 3 additions & 3 deletions TESTING.rst
@@ -30,7 +30,7 @@ Airflow Test Infrastructure
marked as integration tests but soon they will be separated by ``pytest`` annotations.

* **System tests** are automatic tests that use external systems like
-   Google Cloud Platform. These tests are intended for an end-to-end DAG execution.
+   Google Cloud. These tests are intended for an end-to-end DAG execution.
The tests can be executed on both the current version of Apache Airflow and any of the older
versions from 1.10.* series.

@@ -612,7 +612,7 @@ visible to anything that you have installed inside the Docker container.
Currently forwarded credentials are:
* credentials stored in ``${HOME}/.aws`` for the aws Amazon Web Services client
* credentials stored in ``${HOME}/.azure`` for the az Microsoft Azure client
- * credentials stored in ``${HOME}/.config`` for gcloud Google Cloud Platform client (among others)
+ * credentials stored in ``${HOME}/.config`` for gcloud Google Cloud client (among others)
* credentials stored in ``${HOME}/.docker`` for docker client

Adding a New System Test
@@ -878,7 +878,7 @@ your local sources to the ``/opt/airflow`` location of the sources within the co
Setup VM on GCP with SSH forwarding
-----------------------------------

- Below are the steps you need to take to set up your virtual machine in the Google Cloud Platform.
+ Below are the steps you need to take to set up your virtual machine in Google Cloud.

1. The next steps will assume that you have configured environment variables with the name of the network and
a virtual machine, project ID and the zone where the virtual machine will be created
24 changes: 12 additions & 12 deletions UPDATING.md
@@ -521,7 +521,7 @@ The following configurations have been moved from `[core]` to the new `[logging]
#### Remove gcp_service_account_keys option in airflow.cfg file

This option has been removed because it is no longer supported by the Google Kubernetes Engine. The new
- recommended service account keys for the Google Cloud Platform management method is
+ recommended way to manage service account keys for Google Cloud is
[Workload Identity](https://cloud.google.com/kubernetes-engine/docs/how-to/workload-identity).

#### Fernet is enabled by default
@@ -1037,17 +1037,17 @@ have been made to the core (including core operators) as they can affect the int
of this provider.

This section describes the changes that have been made, and what you need to do to update your code if
- you use operators or hooks which integrate with Google services (including Google Cloud Platform - GCP).
+ you use operators or hooks which integrate with Google services (including Google Cloud - GCP).

#### Direct impersonation added to operators communicating with Google services
[Directly impersonating a service account](https://cloud.google.com/iam/docs/understanding-service-accounts#directly_impersonating_a_service_account)
has been made possible for operators communicating with Google services via a new argument called `impersonation_chain`
(`google_impersonation_chain` in case of operators that also communicate with services of other cloud providers).
As a result, GCSToS3Operator no longer derives from GCSListObjectsOperator.
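
For illustration, a minimal sketch of the new argument in use — the operator choice, bucket, and service-account address below are assumptions, not values taken from this commit:

```python
from airflow.providers.google.cloud.operators.gcs import GCSListObjectsOperator

# Sketch only: list objects in a bucket while directly impersonating a
# service account (bucket and account names are hypothetical).
list_objects = GCSListObjectsOperator(
    task_id="list_gcs_objects",
    bucket="example-bucket",
    impersonation_chain="transfer-sa@example-project.iam.gserviceaccount.com",
)
```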

- #### Normalize gcp_conn_id for Google Cloud Platform
+ #### Normalize gcp_conn_id for Google Cloud

- Previously not all hooks and operators related to Google Cloud Platform use
+ Previously, not all hooks and operators related to Google Cloud used
`gcp_conn_id` as the parameter for the GCP connection. There is currently one parameter
which applies to most services. Parameters like ``datastore_conn_id``, ``bigquery_conn_id``,
``google_cloud_storage_conn_id`` and similar have been deprecated. Operators that require two connections are not changed.
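
A hedged sketch of the rename — the operator and connection ID below are illustrative, not part of this commit:

```python
from airflow.providers.google.cloud.operators.bigquery import BigQueryExecuteQueryOperator

# Before: bigquery_conn_id="google_cloud_default" (deprecated parameter name).
# After: the unified gcp_conn_id parameter is used across Google operators.
run_query = BigQueryExecuteQueryOperator(
    task_id="run_query",
    sql="SELECT 1",
    use_legacy_sql=False,
    gcp_conn_id="google_cloud_default",
)
```

The deprecated names are still accepted for backwards compatibility but are expected to emit a deprecation warning.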
@@ -1082,7 +1082,7 @@ Following components were affected by normalization:
#### Changes to import paths and names of GCP operators and hooks

According to [AIP-21](https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-21%3A+Changes+in+import+paths)
- operators related to Google Cloud Platform has been moved from contrib to core.
+ operators related to Google Cloud have been moved from contrib to core.
The following table shows changes in import paths.

| Old path | New path |
@@ -1265,9 +1265,9 @@ The following table shows changes in import paths.
|airflow.contrib.sensors.gcs_sensor.GoogleCloudStorageUploadSessionCompleteSensor |airflow.providers.google.cloud.sensors.gcs.GCSUploadSessionCompleteSensor |
|airflow.contrib.sensors.pubsub_sensor.PubSubPullSensor |airflow.providers.google.cloud.sensors.pubsub.PubSubPullSensor |

- #### Unify default conn_id for Google Cloud Platform
+ #### Unify default conn_id for Google Cloud

- Previously not all hooks and operators related to Google Cloud Platform use
+ Previously, not all hooks and operators related to Google Cloud used
``google_cloud_default`` as a default conn_id. There is currently one default
variant. Values like ``google_cloud_storage_default``, ``bigquery_default``,
``google_cloud_datastore_default`` have been deprecated. The configuration of
@@ -1408,7 +1408,7 @@ Now this parameter requires a value. To restore the previous behavior, configure
specifying the service account.

Detailed information about connection management is available:
- [Google Cloud Platform Connection](https://airflow.apache.org/howto/connection/gcp.html).
+ [Google Cloud Connection](https://airflow.apache.org/howto/connection/gcp.html).


#### `airflow.providers.google.cloud.hooks.gcs.GCSHook`
@@ -2053,7 +2053,7 @@ If the `AIRFLOW_CONFIG` environment variable was not set and the
will discover its config file using the `$AIRFLOW_CONFIG` and `$AIRFLOW_HOME`
environment variables rather than checking for the presence of a file.

- ### Changes in Google Cloud Platform related operators
+ ### Changes in Google Cloud related operators

Most GCP-related operators now have an optional `PROJECT_ID` parameter. In case you do not specify it,
the project id configured in
@@ -2080,7 +2080,7 @@ Operators involved:

Other GCP operators are unaffected.
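
As a sketch of the new fallback behavior — the operator choice and topic name below are assumptions; consult the list above for the operators actually affected:

```python
from airflow.providers.google.cloud.operators.pubsub import PubSubCreateTopicOperator

# project_id is omitted, so the project configured in the Google Cloud
# connection is used instead (topic name is hypothetical).
create_topic = PubSubCreateTopicOperator(
    task_id="create_topic",
    topic="example-topic",
)
```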

- ### Changes in Google Cloud Platform related hooks
+ ### Changes in Google Cloud related hooks

The change in GCP operators implies that GCP Hooks for those operators now require keyword parameters rather
than positional ones in all methods where `project_id` is used. The methods throw an explanatory exception
@@ -2148,7 +2148,7 @@ gct_hook.create_transfer_job(body)
```
The change results from the unification of all hooks and adjusts to
[the official recommendations](https://lists.apache.org/thread.html/e8534d82be611ae7bcb21ba371546a4278aad117d5e50361fd8f14fe@%3Cdev.airflow.apache.org%3E)
- for the Google Cloud Platform.
+ for Google Cloud.
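
A short, hedged sketch of the keyword-style call — the hook path follows the current providers layout, and the body is a minimal illustration, not a complete transfer-job specification:

```python
from airflow.providers.google.cloud.hooks.cloud_storage_transfer_service import (
    CloudDataTransferServiceHook,
)

# Arguments such as body and project_id are now passed by keyword;
# positional calls raise an explanatory exception.
hook = CloudDataTransferServiceHook(gcp_conn_id="google_cloud_default")
job = hook.create_transfer_job(body={"description": "example job"})
```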

The signature of the `wait_for_transfer_job` method in `GCPTransferServiceHook` has changed.

Expand Down Expand Up @@ -2765,7 +2765,7 @@ of user-editable configuration properties. See
All Google Cloud Operators and Hooks are aligned and use the same client library. Now you have a single connection
type for all kinds of Google Cloud Operators.

- If you experience problems connecting with your operator make sure you set the connection type "Google Cloud Platform".
+ If you experience problems connecting with your operator, make sure you set the connection type "Google Cloud".

Also, the old P12 key file type is not supported anymore; only the new JSON key files are supported as a service
account.
6 changes: 3 additions & 3 deletions airflow/providers/amazon/aws/transfers/gcs_to_s3.py
@@ -40,10 +40,10 @@ class GCSToS3Operator(BaseOperator):
    For example, to list the CSV files in a directory in GCS, you would use
    delimiter='.csv'.
:type delimiter: str
- :param gcp_conn_id: (Optional) The connection ID used to connect to Google Cloud Platform.
+ :param gcp_conn_id: (Optional) The connection ID used to connect to Google Cloud.
:type gcp_conn_id: str
- :param google_cloud_storage_conn_id: (Deprecated) The connection ID used to connect to Google Cloud
-     Platform. This parameter has been deprecated. You should pass the gcp_conn_id parameter instead.
+ :param google_cloud_storage_conn_id: (Deprecated) The connection ID used to connect to Google Cloud.
+     This parameter has been deprecated. You should pass the gcp_conn_id parameter instead.
:type google_cloud_storage_conn_id: str
:param delegate_to: Google account to impersonate using domain-wide delegation of authority,
if any. For this to work, the service account making the request must have
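For illustration, a hedged sketch of the renamed parameter in use — the bucket names are assumptions:

```python
from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator

# gcp_conn_id replaces the deprecated google_cloud_storage_conn_id parameter.
gcs_to_s3 = GCSToS3Operator(
    task_id="gcs_to_s3",
    bucket="example-gcs-bucket",
    dest_s3_key="s3://example-s3-bucket/",
    delimiter=".csv",  # only CSV files, as the docstring above describes
    gcp_conn_id="google_cloud_default",
)
```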
2 changes: 1 addition & 1 deletion airflow/providers/google/ads/operators/ads.py
@@ -49,7 +49,7 @@ class GoogleAdsListAccountsOperator(BaseOperator):
:type bucket: str
:param object_name: GCS path to save the csv file. Must be the full file path (ex. `path/to/file.csv`)
:type object_name: str
- :param gcp_conn_id: Airflow Google Cloud Platform connection ID
+ :param gcp_conn_id: Airflow Google Cloud connection ID
:type gcp_conn_id: str
:param google_ads_conn_id: Airflow Google Ads connection ID
:type google_ads_conn_id: str
2 changes: 1 addition & 1 deletion airflow/providers/google/ads/transfers/ads_to_gcs.py
@@ -50,7 +50,7 @@ class GoogleAdsToGcsOperator(BaseOperator):
:type bucket: str
:param obj: GCS path to save the object. Must be the full file path (ex. `path/to/file.txt`)
:type obj: str
- :param gcp_conn_id: Airflow Google Cloud Platform connection ID
+ :param gcp_conn_id: Airflow Google Cloud connection ID
:type gcp_conn_id: str
:param google_ads_conn_id: Airflow Google Ads connection ID
:type google_ads_conn_id: str
@@ -26,7 +26,7 @@
This DAG relies on the following environment variables:
- * GCP_PROJECT_ID - Google Cloud Platform project
+ * GCP_PROJECT_ID - Google Cloud project
* CBT_INSTANCE_ID - desired ID of a Cloud Bigtable instance
* CBT_INSTANCE_DISPLAY_NAME - desired human-readable display name of the Instance
* CBT_INSTANCE_TYPE - type of the Instance, e.g. 1 for DEVELOPMENT
@@ -18,11 +18,11 @@

"""
Example Airflow DAG that creates, patches and deletes a Cloud SQL instance, and also
- creates, patches and deletes a database inside the instance, in Google Cloud Platform.
+ creates, patches and deletes a database inside the instance, in Google Cloud.
This DAG relies on the following OS environment variables
https://airflow.apache.org/concepts.html#variables
- * GCP_PROJECT_ID - Google Cloud Platform project for the Cloud SQL instance.
+ * GCP_PROJECT_ID - Google Cloud project for the Cloud SQL instance.
* INSTANCE_NAME - Name of the Cloud SQL instance.
* DB_NAME - Name of the database inside a Cloud SQL instance.
"""
@@ -21,7 +21,7 @@
This DAG relies on the following OS environment variables
- * GCP_PROJECT_ID - Google Cloud Platform project for the Cloud SQL instance
+ * GCP_PROJECT_ID - Google Cloud project for the Cloud SQL instance
* GCP_REGION - Google Cloud region where the database is created
*
* GCSQL_POSTGRES_INSTANCE_NAME - Name of the postgres Cloud SQL instance
@@ -22,8 +22,8 @@
This DAG relies on the following OS environment variables
- * GCP_PROJECT_ID - Google Cloud Platform project where the Compute Engine instance exists.
- * GCE_ZONE - Google Cloud Platform zone where the instance exists.
+ * GCP_PROJECT_ID - Google Cloud project where the Compute Engine instance exists.
+ * GCE_ZONE - Google Cloud zone where the instance exists.
* GCE_INSTANCE - Name of the Compute Engine instance.
* GCE_SHORT_MACHINE_TYPE_NAME - Machine type resource name to set, e.g. 'n1-standard-1'.
See https://cloud.google.com/compute/docs/machine-types
@@ -23,7 +23,7 @@
This DAG relies on the following OS environment variables
- * GCP_PROJECT_ID - the Google Cloud Platform project where the Compute Engine instance exists
+ * GCP_PROJECT_ID - the Google Cloud project where the Compute Engine instance exists
* GCE_ZONE - the zone where the Compute Engine instance exists
Variables for copy template operator:
@@ -20,7 +20,7 @@
Example Airflow DAG that creates, updates, queries and deletes a Cloud Spanner instance.
This DAG relies on the following environment variables
- * GCP_PROJECT_ID - Google Cloud Platform project for the Cloud Spanner instance.
+ * GCP_PROJECT_ID - Google Cloud project for the Cloud Spanner instance.
* GCP_SPANNER_INSTANCE_ID - Cloud Spanner instance ID.
* GCP_SPANNER_DATABASE_ID - Cloud Spanner database ID.
* GCP_SPANNER_CONFIG_NAME - The name of the instance's configuration. Values are of the
@@ -19,7 +19,7 @@
"""
Example Airflow DAG that creates, gets, lists, updates, purges, pauses, resumes
and deletes Queues and creates, gets, lists, runs and deletes Tasks in the Google
- Cloud Tasks service in the Google Cloud Platform.
+ Cloud Tasks service in Google Cloud.
"""


@@ -18,7 +18,7 @@

"""
Example Airflow DAG that translates text in Google Cloud Translate
- service in the Google Cloud Platform.
+ service in Google Cloud.
"""

@@ -18,7 +18,7 @@

"""
Example Airflow DAG that creates, gets, updates and deletes Products and Product Sets in the Google Cloud
- Vision service in the Google Cloud Platform.
+ Vision service.
This DAG relies on the following OS environment variables
3 changes: 1 addition & 2 deletions airflow/providers/google/cloud/hooks/bigquery.py
@@ -63,8 +63,7 @@
# pylint: disable=too-many-public-methods
class BigQueryHook(GoogleBaseHook, DbApiHook):
"""
-     Interact with BigQuery. This hook uses the Google Cloud Platform
-     connection.
+     Interact with BigQuery. This hook uses the Google Cloud connection.
"""

conn_name_attr = 'gcp_conn_id' # type: str
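A minimal usage sketch, assuming a configured `google_cloud_default` connection; the query is illustrative:

```python
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook

# The hook authenticates through the renamed "Google Cloud" connection type.
bq_hook = BigQueryHook(gcp_conn_id="google_cloud_default", use_legacy_sql=False)
rows = bq_hook.get_records("SELECT 1")  # assumes valid credentials are configured
```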
10 changes: 5 additions & 5 deletions airflow/providers/google/cloud/hooks/bigtable.py
@@ -70,7 +70,7 @@ def get_instance(self, instance_id: str, project_id: str) -> Instance:
:param instance_id: The ID of the Cloud Bigtable instance.
:type instance_id: str
- :param project_id: Optional, Google Cloud Platform project ID where the
+ :param project_id: Optional, Google Cloud project ID where the
BigTable exists. If set to None or missing,
the default project_id from the Google Cloud connection is used.
:type project_id: str
@@ -87,7 +87,7 @@ def delete_instance(self, instance_id: str, project_id: str) -> None:
Raises google.api_core.exceptions.NotFound if the Cloud Bigtable instance does
not exist.
- :param project_id: Optional, Google Cloud Platform project ID where the
+ :param project_id: Optional, Google Cloud project ID where the
BigTable exists. If set to None or missing,
the default project_id from the Google Cloud connection is used.
:type project_id: str
@@ -130,7 +130,7 @@ def create_instance(
:param main_cluster_zone: The zone for main cluster.
See https://cloud.google.com/bigtable/docs/locations for more details.
:type project_id: str
- :param project_id: Optional, Google Cloud Platform project ID where the
+ :param project_id: Optional, Google Cloud project ID where the
BigTable exists. If set to None or missing,
the default project_id from the Google Cloud connection is used.
:type replica_clusters: List[Dict[str, str]]
@@ -213,7 +213,7 @@ def update_instance(
:type instance_id: str
:param instance_id: The ID for the existing instance.
:type project_id: str
- :param project_id: Optional, Google Cloud Platform project ID where the
+ :param project_id: Optional, Google Cloud project ID where the
BigTable exists. If set to None or missing,
the default project_id from the Google Cloud connection is used.
:type instance_display_name: str
@@ -283,7 +283,7 @@ def delete_table(self, instance_id: str, table_id: str, project_id: str) -> None
:type table_id: str
:param table_id: The ID of the table in Cloud Bigtable.
:type project_id: str
- :param project_id: Optional, Google Cloud Platform project ID where the
+ :param project_id: Optional, Google Cloud project ID where the
BigTable exists. If set to None or missing,
the default project_id from the Google Cloud connection is used.
"""
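A short sketch of the fallback these docstrings describe — the instance ID is an assumption:

```python
from airflow.providers.google.cloud.hooks.bigtable import BigtableHook

# project_id is omitted, so the default project from the Google Cloud
# connection is used.
hook = BigtableHook(gcp_conn_id="google_cloud_default")
instance = hook.get_instance(instance_id="example-instance")
```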
6 changes: 3 additions & 3 deletions airflow/providers/google/cloud/hooks/cloud_sql.py
@@ -416,7 +416,7 @@ class CloudSqlProxyRunner(LoggingMixin):
for UNIX socket connections and in the form of
``<project>:<region>:<instance>=tcp:<port>`` for TCP connections.
:type instance_specification: str
- :param gcp_conn_id: Id of Google Cloud Platform connection to use for
+ :param gcp_conn_id: ID of the Google Cloud connection to use for
authentication
:type gcp_conn_id: str
:param project_id: Optional id of the Google Cloud project to connect to - it overwrites
@@ -679,7 +679,7 @@ class CloudSQLDatabaseHook(BaseHook):  # noqa
Remaining parameters are retrieved from the extras (URI query parameters):
- * **project_id** - Optional, Google Cloud Platform project where the Cloud SQL
+ * **project_id** - Optional, Google Cloud project where the Cloud SQL
instance exists. If missing, default project id passed is used.
* **instance** - Name of the instance of the Cloud SQL database instance.
* **location** - The location of the Cloud SQL instance (for example europe-west1).
@@ -700,7 +700,7 @@ class CloudSQLDatabaseHook(BaseHook):  # noqa
:param gcp_cloudsql_conn_id: URL of the connection
:type gcp_cloudsql_conn_id: str
- :param gcp_conn_id: The connection ID used to connect to Google Cloud Platform for
+ :param gcp_conn_id: The connection ID used to connect to Google Cloud for
cloud-sql-proxy authentication.
:type gcp_conn_id: str
:param default_gcp_project_id: Default project id used if project_id not specified
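A minimal sketch of wiring the two connections together — both connection IDs are assumptions:

```python
from airflow.providers.google.cloud.hooks.cloud_sql import CloudSQLDatabaseHook

# gcp_conn_id authenticates cloud-sql-proxy, while gcp_cloudsql_conn_id
# points at the Cloud SQL database connection itself.
db_hook = CloudSQLDatabaseHook(
    gcp_cloudsql_conn_id="cloudsql_example",
    gcp_conn_id="google_cloud_default",
)
```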
