Fix spellings (#14483)
jbampton committed Feb 27, 2021
1 parent afb3432 commit 50a1504
Showing 55 changed files with 100 additions and 100 deletions.
4 changes: 2 additions & 2 deletions BREEZE.rst
@@ -333,7 +333,7 @@ Managing CI environment:
* Generate constraints with ``breeze generate-constraints``
* Execute arbitrary command in the test environment with ``breeze shell`` command
* Execute arbitrary docker-compose command with ``breeze docker-compose`` command
* Push docker images with ``breeze push-image`` command (require committer's rights to push images)
* Push docker images with ``breeze push-image`` command (require committers rights to push images)

You can optionally reset the Airflow metada database if specified as extra ``--db-reset`` flag and for CI image
you can also start integrations (separate Docker images) if specified as extra ``--integration`` flags. You can also
@@ -356,7 +356,7 @@ Managing Prod environment (with ``--production-image`` flag):
* Restart running interactive environment with ``breeze restart`` command
* Execute arbitrary command in the test environment with ``breeze shell`` command
* Execute arbitrary docker-compose command with ``breeze docker-compose`` command
* Push docker images with ``breeze push-image`` command (require committer's rights to push images)
* Push docker images with ``breeze push-image`` command (require committers rights to push images)

You can optionally reset database if specified as extra ``--db-reset`` flag. You can also
chose which backend database should be used with ``--backend`` flag and python version with ``--python`` flag.
2 changes: 1 addition & 1 deletion CONTRIBUTING.rst
@@ -983,7 +983,7 @@ If this function is designed to be called by "end-users" (i.e. DAG authors) then
...
# You SHOULD not commit the session here. The wrapper will take care of commit()/rollback() if exception
Don't use time() for duration calcuations
Don't use time() for duration calculations
-----------------------------------------

If you wish to compute the time difference between two events with in the same process, use
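For context on the heading fixed above: the usual advice for measuring a duration within one process is a monotonic clock rather than `time()`, since the wall clock can be adjusted mid-measurement. A minimal sketch of that approach (the exact recommendation in CONTRIBUTING.rst is truncated in this hunk, so the example below is an assumption):

```python
import time

# time.time() follows the wall clock, which can jump (NTP sync, manual changes),
# so durations computed from it can be wrong or even negative.
# time.monotonic() only ever moves forward, which makes it safe for durations.
start = time.monotonic()
time.sleep(0.1)  # stand-in for the work being timed
duration = time.monotonic() - start
print(f"took {duration:.3f}s")
```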
2 changes: 1 addition & 1 deletion Dockerfile
@@ -256,7 +256,7 @@ ENV AIRFLOW_INSTALLATION_METHOD=${AIRFLOW_INSTALLATION_METHOD}
ARG AIRFLOW_VERSION_SPECIFICATION=""
ENV AIRFLOW_VERSION_SPECIFICATION=${AIRFLOW_VERSION_SPECIFICATION}

# We can seet this value to true in case we want to install .whl .tar.gz packages placed in the
# We can set this value to true in case we want to install .whl .tar.gz packages placed in the
# docker-context-files folder. This can be done for both - additional packages you want to install
# and for airflow as well (you have to set INSTALL_FROM_PYPI to false in this case)
ARG INSTALL_FROM_DOCKER_CONTEXT_FILES=""
2 changes: 1 addition & 1 deletion IMAGES.rst
@@ -228,7 +228,7 @@ Choosing image registry
=======================

By default images are pulled and pushed from and to DockerHub registry when you use Breeze's push-image
or build commands. But as described in `CI Documentaton <CI.rst>`_, you can choose different image
or build commands. But as described in `CI Documentation <CI.rst>`_, you can choose different image
registry by setting ``GITHUB_REGISTRY`` to ``docker.pkg.github.com`` for Github Package Registry or
``ghcr.io`` for GitHub Container Registry.

4 changes: 2 additions & 2 deletions UPDATING.md
@@ -780,7 +780,7 @@ In previous versions, the `LatestOnlyOperator` forcefully skipped all (direct an

No change is needed if only the default trigger rule `all_success` is being used.

If the DAG relies on tasks with other trigger rules (i.e. `all_done`) being skipped by the `LatestOnlyOperator`, adjustments to the DAG need to be made to commodate the change in behaviour, i.e. with additional edges from the `LatestOnlyOperator`.
If the DAG relies on tasks with other trigger rules (i.e. `all_done`) being skipped by the `LatestOnlyOperator`, adjustments to the DAG need to be made to accommodate the change in behaviour, i.e. with additional edges from the `LatestOnlyOperator`.

The goal of this change is to achieve a more consistent and configurale cascading behaviour based on the `BaseBranchOperator` (see [AIRFLOW-2923](https://jira.apache.org/jira/browse/AIRFLOW-2923) and [AIRFLOW-1784](https://jira.apache.org/jira/browse/AIRFLOW-1784)).

@@ -1662,7 +1662,7 @@ ImapHook:
#### `airflow.providers.http.hooks.http.HttpHook`

The HTTPHook is now secured by default: `verify=True` (before: `verify=False`)
This can be overwriten by using the extra_options param as `{'verify': False}`.
This can be overwritten by using the extra_options param as `{'verify': False}`.

#### `airflow.providers.cloudant.hooks.cloudant.CloudantHook`

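To illustrate the `extra_options` note above, a short sketch of overriding certificate verification on a request made through `HttpHook` (the connection id and endpoint are placeholders):

```python
from airflow.providers.http.hooks.http import HttpHook

# verify=True is now the default; pass extra_options to override it per call.
hook = HttpHook(method="GET", http_conn_id="http_default")
response = hook.run(endpoint="health", extra_options={"verify": False})
print(response.status_code)
```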
2 changes: 1 addition & 1 deletion airflow/cli/commands/webserver_command.py
@@ -188,7 +188,7 @@ def _kill_old_workers(self, count: int) -> None:

def _reload_gunicorn(self) -> None:
"""
Send signal to reload the gunciron configuration. When gunciorn receive signals, it reload the
Send signal to reload the gunicorn configuration. When gunicorn receive signals, it reload the
configuration, start the new worker processes with a new configuration and gracefully
shutdown older workers.
"""
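For readers unfamiliar with the mechanism described in that docstring: gunicorn re-reads its configuration when its master process receives SIGHUP. A rough sketch of such a reload helper (an illustration, not Airflow's actual implementation; the pid argument is assumed):

```python
import os
import signal


def reload_gunicorn(gunicorn_master_pid: int) -> None:
    """Ask a gunicorn master process to re-read its configuration.

    On SIGHUP gunicorn starts workers with the new configuration and then
    gracefully shuts down the old ones.
    """
    os.kill(gunicorn_master_pid, signal.SIGHUP)
```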
2 changes: 1 addition & 1 deletion airflow/executors/celery_executor.py
@@ -363,7 +363,7 @@ def _check_for_stalled_adopted_tasks(self):
# If the task gets updated to STARTED (which Celery does) or has
# already finished, then it will be removed from this list -- so
# the only time it's still in this list is when it a) never made it
# to celery in the first place (i.e. race condition somehwere in
# to celery in the first place (i.e. race condition somewhere in
# the dying executor) or b) a really long celery queue and it just
# hasn't started yet -- better cancel it and let the scheduler
# re-queue rather than have this task risk stalling for ever
2 changes: 1 addition & 1 deletion airflow/models/dag.py
@@ -1400,7 +1400,7 @@ def clear_dags(
return count

def __deepcopy__(self, memo):
# Swiwtcharoo to go around deepcopying objects coming through the
# Switcharoo to go around deepcopying objects coming through the
# backdoor
cls = self.__class__
result = cls.__new__(cls)
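The hunk above is part of a custom `__deepcopy__`. As a generic illustration of the pattern it uses (not Airflow's full implementation), `cls.__new__(cls)` creates a bare instance without running `__init__`, the instance is registered in `memo` to handle cycles, and attributes are then copied selectively:

```python
import copy


class Example:
    def __init__(self):
        self.data = {"a": 1}
        self.shared = object()  # pretend this attribute must not be deep-copied

    def __deepcopy__(self, memo):
        cls = self.__class__
        result = cls.__new__(cls)   # bare instance, __init__ is not called
        memo[id(self)] = result     # register early so cycles resolve to this copy
        for key, value in self.__dict__.items():
            if key == "shared":
                setattr(result, key, value)                      # keep the reference
            else:
                setattr(result, key, copy.deepcopy(value, memo))
        return result


original = Example()
clone = copy.deepcopy(original)
assert clone.data == original.data and clone.data is not original.data
assert clone.shared is original.shared
```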
4 changes: 2 additions & 2 deletions airflow/models/dagbag.py
@@ -525,7 +525,7 @@ def sync_to_db(self, session: Optional[Session] = None):
from airflow.models.dag import DAG
from airflow.models.serialized_dag import SerializedDagModel

def _serialze_dag_capturing_errors(dag, session):
def _serialize_dag_capturing_errors(dag, session):
"""
Try to serialize the dag to the DB, but make a note of any errors.
@@ -561,7 +561,7 @@ def _serialze_dag_capturing_errors(dag, session):
try:
# Write Serialized DAGs to DB, capturing errors
for dag in self.dags.values():
serialize_errors.extend(_serialze_dag_capturing_errors(dag, session))
serialize_errors.extend(_serialize_dag_capturing_errors(dag, session))

DAG.bulk_write_to_db(self.dags.values(), session=session)
except OperationalError:
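The renamed helper follows a collect-rather-than-raise pattern: each DAG is written inside its own try/except and failures are returned as records so the rest of the sync can proceed. A rough, generic sketch of that pattern (names and the error shape are illustrative, not Airflow's exact ones):

```python
from typing import Callable, Dict, List, Tuple


def write_all_capturing_errors(
    items: Dict[str, object], write: Callable[[str, object], None]
) -> List[Tuple[str, str]]:
    """Try to persist every item; return (key, error message) pairs instead of raising."""
    errors: List[Tuple[str, str]] = []
    for key, value in items.items():
        try:
            write(key, value)
        except Exception as exc:  # record any failure and keep going
            errors.append((key, str(exc)))
    return errors


def flaky_write(key: str, value: object) -> None:
    if key == "bad":
        raise ValueError("cannot serialize")


print(write_all_capturing_errors({"ok": 1, "bad": 2}, flaky_write))
# [('bad', 'cannot serialize')]
```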
4 changes: 2 additions & 2 deletions airflow/models/dagrun.py
@@ -440,7 +440,7 @@ def update_state(
msg='task_failure',
)

# if all leafs succeeded and no unfinished tasks, the run succeeded
# if all leaves succeeded and no unfinished tasks, the run succeeded
elif not unfinished_tasks and all(leaf_ti.state in State.success_states for leaf_ti in leaf_tis):
self.log.info('Marking run %s successful', self)
self.set_state(State.SUCCESS)
@@ -592,7 +592,7 @@ def _emit_true_scheduling_delay_stats_for_finished_state(self, finished_tis):
dag = self.get_dag()

if not self.dag.schedule_interval or self.dag.schedule_interval == "@once":
# We can't emit this metric if there is no following schedule to cacluate from!
# We can't emit this metric if there is no following schedule to calculate from!
return

ordered_tis_by_start_date = [ti for ti in finished_tis if ti.start_date]
@@ -29,7 +29,7 @@


# [START howto_configure_simple_action_pipeline]
SIMPLE_ACTION_PIEPELINE = {
SIMPLE_ACTION_PIPELINE = {
"pipeline": {
"actions": [
{"imageUri": "bash", "commands": ["-c", "echo Hello, world"]},
@@ -83,7 +83,7 @@
# [START howto_run_pipeline]
simple_life_science_action_pipeline = LifeSciencesRunPipelineOperator(
task_id='simple-action-pipeline',
body=SIMPLE_ACTION_PIEPELINE,
body=SIMPLE_ACTION_PIPELINE,
project_id=PROJECT_ID,
location=LOCATION,
)
2 changes: 1 addition & 1 deletion airflow/providers/google/cloud/operators/compute.py
@@ -355,7 +355,7 @@ def execute(self, context) -> None:
dict(name="onHostMaintenance", optional=True),
dict(name="automaticRestart", optional=True),
dict(name="preemptible", optional=True),
dict(name="nodeAffinitites", optional=True), # not validating deeper
dict(name="nodeAffinities", optional=True), # not validating deeper
],
),
dict(name="labels", optional=True),
2 changes: 1 addition & 1 deletion airflow/providers/google/cloud/transfers/sql_to_gcs.py
@@ -263,7 +263,7 @@ def _configure_parquet_file(self, file_handle, parquet_schema):

def _convert_parquet_schema(self, cursor):
type_map = {
'INTERGER': pa.int64(),
'INTEGER': pa.int64(),
'FLOAT': pa.float64(),
'NUMERIC': pa.float64(),
'BIGNUMERIC': pa.float64(),
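The corrected type map feeds into building a `pyarrow` schema for Parquet export. A small illustration of how such a map can be applied to column metadata (the column list is made up, and falling back to strings for unknown types is an assumption):

```python
import pyarrow as pa

type_map = {
    "INTEGER": pa.int64(),
    "FLOAT": pa.float64(),
    "NUMERIC": pa.float64(),
}

# Hypothetical (column name, database type) pairs, e.g. taken from a cursor description.
columns = [("id", "INTEGER"), ("score", "FLOAT"), ("name", "STRING")]
schema = pa.schema(
    [(name, type_map.get(db_type, pa.string())) for name, db_type in columns]
)
print(schema)
```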
4 changes: 2 additions & 2 deletions airflow/providers/google/common/utils/id_token_credentials.py
@@ -205,8 +205,8 @@ def get_default_id_token_credentials(
if __name__ == "__main__":
from google.auth.transport import requests

request_adaapter = requests.Request()
request_adapter = requests.Request()

creds = get_default_id_token_credentials(target_audience=None)
creds.refresh(request=request_adaapter)
creds.refresh(request=request_adapter)
print(creds.token)
2 changes: 1 addition & 1 deletion airflow/settings.py
@@ -435,7 +435,7 @@ def initialize():
configure_orm()
configure_action_logging()

# Ensure we close DB connections at scheduler and gunicon worker terminations
# Ensure we close DB connections at scheduler and gunicorn worker terminations
atexit.register(dispose_orm)


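The comment fixed above refers to registering a cleanup hook with `atexit` so pooled database connections are disposed when a scheduler or gunicorn worker process exits. A minimal, generic sketch of that pattern with SQLAlchemy (the engine URL is illustrative only):

```python
import atexit

from sqlalchemy import create_engine

engine = create_engine("sqlite://")  # illustrative in-memory database


def dispose_orm() -> None:
    """Close pooled DB connections so process shutdown is clean."""
    engine.dispose()


# Called automatically when the interpreter exits normally.
atexit.register(dispose_orm)
```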
2 changes: 1 addition & 1 deletion airflow/utils/db.py
@@ -256,7 +256,7 @@ def create_default_connections(session=None):
conn_id="facebook_default",
conn_type="facebook_social",
extra="""
{ "account_id": "<AD_ACCOUNNT_ID>",
{ "account_id": "<AD_ACCOUNT_ID>",
"app_id": "<FACEBOOK_APP_ID>",
"app_secret": "<FACEBOOK_APP_SECRET>",
"access_token": "<FACEBOOK_AD_ACCESS_TOKEN>"
2 changes: 1 addition & 1 deletion airflow/www/static/js/datetime_utils.js
@@ -76,7 +76,7 @@ export function updateAllDateTimes() {
const dt = moment($el.attr('datetime'));
$el.text(dt.format(defaultFormat));
if ($el.attr('title') !== undefined) {
// If displayed date is not UTC, have the UTC date in a title attriubte
// If displayed date is not UTC, have the UTC date in a title attribute
$el.attr('title', dt.isUTC() ? '' : `UTC: ${dt.clone().utc().format()}`);
}
});
6 changes: 3 additions & 3 deletions airflow/www/static/js/gantt_chart_d3v2.js
@@ -119,7 +119,7 @@ d3.gantt = function() {
return "translate(" + (x(d.start_date.valueOf()) + yAxisLeftOffset) + "," + y(d.task_id) + ")";
};

function tickFormater(d) {
function tickFormatter(d) {
// We can't use d3.time.format as that uses local time, so instead we use
// moment as that handles our "global" timezone.
return moment(d).strftime(tickFormat);
@@ -129,7 +129,7 @@

var y = d3.scale.ordinal().domain(taskTypes).rangeRoundBands([ 0, height - margin.top - margin.bottom ], .1);

var xAxis = d3.svg.axis().scale(x).orient("bottom").tickFormat(tickFormater).tickSubdivide(true)
var xAxis = d3.svg.axis().scale(x).orient("bottom").tickFormat(tickFormatter).tickSubdivide(true)
.tickSize(8).tickPadding(8);

var yAxis = d3.svg.axis().scale(y).orient("left").tickSize(0);
@@ -157,7 +157,7 @@ d3.gantt = function() {
var initAxis = function() {
x = d3.time.scale().domain([ timeDomainStart, timeDomainEnd ]).range([ 0, width-yAxisLeftOffset ]).clamp(true);
y = d3.scale.ordinal().domain(taskTypes).rangeRoundBands([ 0, height - margin.top - margin.bottom ], .1);
xAxis = d3.svg.axis().scale(x).orient("bottom").tickFormat(tickFormater).tickSubdivide(true)
xAxis = d3.svg.axis().scale(x).orient("bottom").tickFormat(tickFormatter).tickSubdivide(true)
.tickSize(8).tickPadding(8);

yAxis = d3.svg.axis().scale(y).orient("left").tickSize(0);
4 changes: 2 additions & 2 deletions breeze
@@ -1024,7 +1024,7 @@ function breeze::parse_arguments() {
echo "Additional apt dev dependencies: ${ADDITIONAL_DEV_APT_DEPS}"
shift 2
;;
--dev-apt-commad)
--dev-apt-command)
export DEV_APT_COMMAND="${2}"
echo "Apt dev command: ${DEV_APT_COMMAND}"
shift 2
@@ -1049,7 +1049,7 @@
echo "Additional apt runtime dependencies: ${ADDITIONAL_RUNTIME_APT_DEPS}"
shift 2
;;
--runtime-apt-commad)
--runtime-apt-command)
export RUNTIME_APT_COMMAND="${2}"
echo "Apt runtime command: ${RUNTIME_APT_COMMAND}"
shift 2
4 changes: 2 additions & 2 deletions docs/exts/docs_build/docs_builder.py
@@ -150,8 +150,8 @@ def check_spelling(self, verbose):
)
warning_text = ""
for filepath in glob(f"{tmp_dir}/**/*.spelling", recursive=True):
with open(filepath) as speeling_file:
warning_text += speeling_file.read()
with open(filepath) as spelling_file:
warning_text += spelling_file.read()

spelling_errors.extend(parse_spelling_warnings(warning_text, self._src_dir))
return spelling_errors
2 changes: 1 addition & 1 deletion docs/exts/docs_build/github_action_utils.py
@@ -23,7 +23,7 @@
def with_group(title):
"""
If used in GitHub Action, creates an expandable group in the GitHub Action log.
Otherwise, dispaly simple text groups.
Otherwise, display simple text groups.
For more information, see:
https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-commands-for-github-actions#grouping-log-lines
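`with_group` relies on GitHub Actions workflow commands: printing `::group::<title>` opens a collapsible group in the job log and `::endgroup::` closes it (see the documentation linked above). A rough sketch of how such a context manager could look (an illustration, not the exact Airflow implementation; detecting CI via the `GITHUB_ACTIONS` variable is one common approach):

```python
import os
from contextlib import contextmanager


@contextmanager
def with_group(title: str):
    """Group log lines in GitHub Actions; fall back to a plain header elsewhere."""
    if os.environ.get("GITHUB_ACTIONS") == "true":
        print(f"::group::{title}", flush=True)
        try:
            yield
        finally:
            print("::endgroup::", flush=True)
    else:
        print(f"===== {title} =====", flush=True)
        yield


with with_group("Build docs"):
    print("...build output...")
```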
2 changes: 1 addition & 1 deletion scripts/ci/libraries/_push_pull_remove_images.sh
@@ -104,7 +104,7 @@ function push_pull_remove_images::pull_image_github_dockerhub() {
set -e
}

# Pulls the base Python image. This image is used as base for CI and PROD imaages, depending on the parameters used:
# Pulls the base Python image. This image is used as base for CI and PROD images, depending on the parameters used:
#
# * if UPGRADE_TO_NEWER_DEPENDENCIES is noy false, then it pulls the latest Python image available first and
# adds `org.opencontainers.image.source` label to it, so that it is linked to Airflow repository when
2 changes: 1 addition & 1 deletion tests/api/common/experimental/test_mark_tasks.py
@@ -592,7 +592,7 @@ def test_set_state_without_commit(self):

will_be_altered = set_dag_run_state_to_failed(self.dag1, date, commit=False)

# Only the running task shouldbe altered.
# Only the running task should be altered.
expected = self._get_num_tasks_with_starting_state(State.RUNNING, inclusion=True)
assert len(will_be_altered) == expected
self._verify_dag_run_state(self.dag1, date, State.RUNNING)
2 changes: 1 addition & 1 deletion tests/api_connexion/test_error_handling.py
@@ -39,7 +39,7 @@ def test_incorrect_endpoint_should_return_json(self):

assert 404 == resp_json["status"]

# When we are hitting non-api incorrect enpoint
# When we are hitting non-api incorrect endpoint

resp_json = self.client.get("/incorrect_endpoint").json

8 changes: 4 additions & 4 deletions tests/cli/commands/test_task_command.py
@@ -237,10 +237,10 @@ def test_task_states_for_dag_run(self):

dag2 = DagBag().dags['example_python_operator']
task2 = dag2.get_task(task_id='print_the_context')
defaut_date2 = timezone.make_aware(datetime(2016, 1, 9))
default_date2 = timezone.make_aware(datetime(2016, 1, 9))
dag2.clear()

ti2 = TaskInstance(task2, defaut_date2)
ti2 = TaskInstance(task2, default_date2)

ti2.set_state(State.SUCCESS)
ti_start = ti2.start_date
@@ -253,7 +253,7 @@
'tasks',
'states-for-dag-run',
'example_python_operator',
defaut_date2.isoformat(),
default_date2.isoformat(),
'--output',
"json",
]
@@ -352,7 +352,7 @@ def tearDown(self) -> None:
def assert_log_line(self, text, logs_list, expect_from_logging_mixin=False):
"""
Get Log Line and assert only 1 Entry exists with the given text. Also check that
"logging_mixin" line does not appear in that log line to avoid duplicate loggigng as below:
"logging_mixin" line does not appear in that log line to avoid duplicate logging as below:
[2020-06-24 16:47:23,537] {logging_mixin.py:91} INFO - [2020-06-24 16:47:23,536] {python.py:135}
"""
6 changes: 3 additions & 3 deletions tests/jobs/test_local_task_job.py
@@ -355,7 +355,7 @@ def task_function(ti):
job1 = LocalTaskJob(task_instance=ti, ignore_ti_state=True, executor=SequentialExecutor())
with timeout(30):
# This should be _much_ shorter to run.
# If you change this limit, make the timeout in the callbable above bigger
# If you change this limit, make the timeout in the callable above bigger
job1.run()

ti.refresh_from_db()
@@ -422,7 +422,7 @@ def dummy_return_code(*args, **kwargs):

with timeout(10):
# This should be _much_ shorter to run.
# If you change this limit, make the timeout in the callbable above bigger
# If you change this limit, make the timeout in the callable above bigger
job1.run()

ti.refresh_from_db()
@@ -431,7 +431,7 @@

def test_mark_success_on_success_callback(self):
"""
Test that ensures that where a task is marked suceess in the UI
Test that ensures that where a task is marked success in the UI
on_success_callback gets executed
"""
# use shared memory value so we can properly track value change even if
2 changes: 1 addition & 1 deletion tests/jobs/test_scheduler_job.py
@@ -2359,7 +2359,7 @@ def test_dagrun_root_fail_unfinished(self):
dag.run(start_date=dr.execution_date, end_date=dr.execution_date, executor=self.null_exec)

# Mark the successful task as never having run since we want to see if the
# dagrun will be in a running state despite haveing an unfinished task.
# dagrun will be in a running state despite having an unfinished task.
with create_session() as session:
ti = dr.get_task_instance('test_dagrun_unfinished', session=session)
ti.state = State.NONE
2 changes: 1 addition & 1 deletion tests/kubernetes/test_pod_generator.py
@@ -293,7 +293,7 @@ def test_from_obj(self):
}
assert (
result_from_pod == expected_from_pod
), "There was a discrepency between KubernetesExecutor and pod_override"
), "There was a discrepancy between KubernetesExecutor and pod_override"

assert {
'apiVersion': 'v1',
4 changes: 2 additions & 2 deletions tests/models/test_dag.py
@@ -717,7 +717,7 @@ def test_bulk_write_to_db_max_active_runs(self):

model = session.query(DagModel).get((dag.dag_id,))
assert model.next_dagrun == period_end
# We signle "at max active runs" by saying this run is never eligible to be created
# We signal "at max active runs" by saying this run is never eligible to be created
assert model.next_dagrun_create_after is None

def test_sync_to_db(self):
@@ -1042,7 +1042,7 @@ def test_dag_handle_callback_crash(self, mock_stats):
dag.add_task(BaseOperator(task_id="faketastic", owner='Also fake', start_date=when))

dag_run = dag.create_dagrun(State.RUNNING, when, run_type=DagRunType.MANUAL)
# should not rause any exception
# should not raise any exception
dag.handle_callback(dag_run, success=False)
dag.handle_callback(dag_run, success=True)

2 changes: 1 addition & 1 deletion tests/models/test_dagcode.py
@@ -128,7 +128,7 @@ def _compare_example_dags(self, example_dags):
def test_code_can_be_read_when_no_access_to_file(self):
"""
Test that code can be retrieved from DB when you do not have access to Code file.
Source Code should atleast exist in one of DB or File.
Source Code should at least exist in one of DB or File.
"""
example_dag = make_example_dags(example_dags_module).get('example_bash_operator')
example_dag.sync_to_db()
