Fix various typos in the repo (#10263)
kaxil committed Aug 9, 2020
1 parent 5503a6a commit b43f90a
Showing 10 changed files with 17 additions and 16 deletions.
2 changes: 1 addition & 1 deletion airflow/api_connexion/schemas/health_schema.py
@@ -19,7 +19,7 @@


class BaseInfoSchema(Schema):
"""Base status field for metadabase and scheduler"""
"""Base status field for metadatabase and scheduler"""
status = fields.String(dump_only=True)


2 changes: 1 addition & 1 deletion airflow/models/baseoperator.py
@@ -891,7 +891,7 @@ def render_template( # pylint: disable=too-many-return-statements
if not jinja_env:
jinja_env = self.get_template_env()

- # Imported here to avoid ciruclar dependency
+ # Imported here to avoid circular dependency
from airflow.models.xcom_arg import XComArg

if isinstance(content, str):
4 changes: 2 additions & 2 deletions airflow/models/dag.py
@@ -1005,11 +1005,11 @@ def clear(
:type dry_run: bool
:param session: The sqlalchemy session to use
:type session: sqlalchemy.orm.session.Session
- :param get_tis: Return the sqlachemy query for finding the TaskInstance without clearing the tasks
+ :param get_tis: Return the sqlalchemy query for finding the TaskInstance without clearing the tasks
:type get_tis: bool
:param recursion_depth: The recursion depth of nested calls to DAG.clear().
:type recursion_depth: int
- :param max_recursion_depth: The maximum recusion depth allowed. This is determined by the
+ :param max_recursion_depth: The maximum recursion depth allowed. This is determined by the
first encountered ExternalTaskMarker. Default is None indicating no ExternalTaskMarker
has been encountered.
:type max_recursion_depth: int
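
As a hedged aside (not part of the diff above): a minimal sketch of calling ``DAG.clear`` with the documented ``dry_run`` flag. The DAG definition and the assumption that a dry run only reports the matching task instances are illustrative, not taken from this commit, and running it requires an initialised Airflow metadata database.

.. code-block:: python

    from datetime import datetime

    from airflow.models.dag import DAG

    dag = DAG("example_dag", start_date=datetime(2020, 1, 1), schedule_interval=None)

    # dry_run=True: report what would be cleared without modifying task state
    # (assumed behaviour, per the parameter documentation above).
    tis = dag.clear(dry_run=True)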
6 changes: 3 additions & 3 deletions airflow/models/dagcode.py
@@ -97,11 +97,11 @@ def bulk_sync_to_db(cls, filelocs: Iterable[str], session=None):
existing_orm_dag_codes_by_fileloc_hashes = {
orm.fileloc_hash: orm for orm in existing_orm_dag_codes
}
- exisitng_orm_filelocs = {
+ existing_orm_filelocs = {
orm.fileloc for orm in existing_orm_dag_codes_by_fileloc_hashes.values()
}
- if not exisitng_orm_filelocs.issubset(filelocs):
- conflicting_filelocs = exisitng_orm_filelocs.difference(filelocs)
+ if not existing_orm_filelocs.issubset(filelocs):
+ conflicting_filelocs = existing_orm_filelocs.difference(filelocs)
hashes_to_filelocs = {
DagCode.dag_fileloc_hash(fileloc): fileloc for fileloc in filelocs
}
2 changes: 1 addition & 1 deletion airflow/providers/apache/hive/hooks/hive.py
@@ -107,7 +107,7 @@ def __init__(

def _get_proxy_user(self) -> str:
"""
- This function set the proper proxy_user value in case the user overwtire the default.
+ This function set the proper proxy_user value in case the user overwrite the default.
"""
conn = self.conn

2 changes: 1 addition & 1 deletion airflow/providers/apache/kylin/hooks/kylin.py
@@ -55,7 +55,7 @@ def get_conn(self):

def cube_run(self, datasource_name, op, **op_args):
"""
- run CubeSource command whitch in CubeSource.support_invoke_command
+ run CubeSource command which in CubeSource.support_invoke_command
:param datasource_name:
:param op: command
:param op_args: command args
4 changes: 2 additions & 2 deletions airflow/providers/google/cloud/hooks/dlp.py
@@ -143,13 +143,13 @@ def create_deidentify_template(
de-identifying content, images, and storage.
:param organization_id: (Optional) The organization ID. Required to set this
- field if parent resource is an organzation.
+ field if parent resource is an organization.
:type organization_id: str
:param project_id: (Optional) Google Cloud Platform project ID where the
DLP Instance exists. Only set this field if the parent resource is
a project instead of an organzation.
:type project_id: str
- :param deidentify_template: (Optional) The deidentify template to create.
+ :param deidentify_template: (Optional) The de-identify template to create.
:type deidentify_template: dict or google.cloud.dlp_v2.types.DeidentifyTemplate
:param template_id: (Optional) The template ID.
:type template_id: str
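
The hunk above documents ``CloudDLPHook.create_deidentify_template``. As a hedged illustration of the ``project_id``/``organization_id`` choice described in the parameters (all values below are hypothetical and not part of this commit):

.. code-block:: python

    from airflow.providers.google.cloud.hooks.dlp import CloudDLPHook

    hook = CloudDLPHook(gcp_conn_id="google_cloud_default")

    # Parent is a project here, so organization_id is omitted.
    template = hook.create_deidentify_template(
        project_id="my-project",
        deidentify_template={"display_name": "mask-emails"},
        template_id="mask_emails_template",
    )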
2 changes: 1 addition & 1 deletion backport_packages/refactor_backport_packages.py
@@ -79,7 +79,7 @@ def copy_helper_py_file(target_file_path: str) -> None:
The helper has two methods (chain, cross_downstream) that are moved from the original helper to
'airflow.models.baseoperator'. so in 1.10 they should reimport the original 'airflow.utils.helper'
- methods. Those deprecated methods use importe with import_string("<IMPORT>") so it is easier to
+ methods. Those deprecated methods use import with import_string("<IMPORT>") so it is easier to
replace them as strings rather than with Bowler
:param target_file_path: target path name for the helpers.py
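
A hedged sketch of the ``import_string`` pattern the docstring refers to; the dotted path below is an assumption about the 1.10 location of the helper, not something taken from this commit:

.. code-block:: python

    from airflow.utils.module_loading import import_string

    # Resolve the deprecated helper lazily from its dotted path.
    chain = import_string("airflow.utils.helpers.chain")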
4 changes: 2 additions & 2 deletions docs/concepts.rst
@@ -243,7 +243,7 @@ The decorated function can be called once to set the arguments and key arguments
Task decorator captures returned values and sends them to the :ref:`XCom backend <concepts:xcom>`. By default, returned
value is saved as a single XCom value. You can set ``multiple_outputs`` key argument to ``True`` to unroll dictionaries,
- lists or tuples into seprate XCom values. This can be used with regular operators to create
+ lists or tuples into separate XCom values. This can be used with regular operators to create
:ref:`functional DAGs <concepts:functional_dags>`.

Calling a decorated function returns an ``XComArg`` instance. You can use it to set templated fields on downstream
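
As an aside to the ``concepts.rst`` hunk above, a minimal sketch of ``multiple_outputs`` (the import path is assumed from the 2.0-era docs and is not part of this commit):

.. code-block:: python

    from airflow.decorators import task

    @task(multiple_outputs=True)
    def split_name(full_name: str):
        first, last = full_name.split(" ", 1)
        # With multiple_outputs=True the returned dict is unrolled into
        # separate XCom values keyed "first" and "last".
        return {"first": first, "last": last}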
@@ -743,7 +743,7 @@ Custom XCom backend

It is possible to change ``XCom`` behaviour os serialization and deserialization of tasks' result.
To do this one have to change ``xcom_backend`` parameter in Airflow config. Provided value should point
- to a class that is subclass of :class:`~airflow.models.xcom.BaseXCom`. To alter the serialaization /
+ to a class that is subclass of :class:`~airflow.models.xcom.BaseXCom`. To alter the serialization /
deserialization mechanism the custom class should override ``serialize_value`` and ``deserialize_value``
methods.
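
A minimal sketch of the custom backend described above; the class and module are hypothetical, and only ``BaseXCom`` and the two overridden methods come from the text:

.. code-block:: python

    from airflow.models.xcom import BaseXCom

    class CustomXComBackend(BaseXCom):
        """Hypothetical backend; point the ``xcom_backend`` config option at this class."""

        @staticmethod
        def serialize_value(value):
            # For example, push large payloads to external storage here
            # and store only a reference in the metadata database.
            return BaseXCom.serialize_value(value)

        @staticmethod
        def deserialize_value(result):
            return BaseXCom.deserialize_value(result)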

5 changes: 3 additions & 2 deletions docs/howto/write-logs.rst
@@ -264,7 +264,8 @@ To output task logs to stdout in JSON format, the following config could be used
Writing Logs to Elasticsearch over TLS
----------------------------------------

- To add custom configurations to ElasticSearch (e.g. turning on ``ssl_verify``, adding a custom self-signed cert, etc.) use the ``elasticsearch_configs`` setting in your ``airfow.cfg``
+ To add custom configurations to ElasticSearch (e.g. turning on ``ssl_verify``, adding a custom self-signed
+ cert, etc.) use the ``elasticsearch_configs`` setting in your ``airflow.cfg``

.. code-block:: ini
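
The original ini example is truncated in this view. Purely as a hedged illustration (the option names are assumptions about what the section accepts, not recovered from the commit), such a block might look like:

.. code-block:: ini

    [elasticsearch_configs]
    use_ssl = True
    verify_certs = True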
@@ -312,7 +313,7 @@ logs under the name ``airflow-tasks``.

You can set ``stackdriver_key_path`` option in the ``[logging]`` section to specify the path to `the service
account key file <https://cloud.google.com/iam/docs/service-accounts>`__.
- If ommited, authorization based on `the Application Default Credentials
+ If omitted, authorization based on `the Application Default Credentials
<https://cloud.google.com/docs/authentication/production#finding_credentials_automatically>`__ will
be used.
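
A hedged ``airflow.cfg`` sketch of the option just described (the key path is a placeholder):

.. code-block:: ini

    [logging]
    ; Path to the service account key file; omit to fall back to
    ; Application Default Credentials.
    stackdriver_key_path = /path/to/service-account-key.json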

