Separate out documentation building per provider (#12444)
* POC

* fixup! POC
mik-laj committed Nov 20, 2020
1 parent c3cf695 commit c34ef85
Showing 259 changed files with 4,517 additions and 1,953 deletions.
16 changes: 15 additions & 1 deletion .github/workflows/ci.yml
@@ -314,10 +314,24 @@ jobs:
run: ./scripts/ci/docs/ci_docs.sh --docs-only
- name: "Upload documentation"
uses: actions/upload-artifact@v2
if: always()
if: always() && github.event_name == 'pull_request'
with:
name: airflow-documentation
path: "./files/documentation"
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v1
if: >
github.ref == 'refs/heads/master' && github.repository == 'apache/airflow' &&
github.event_name == 'push'
with:
aws-access-key-id: ${{ secrets.DOCS_AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.DOCS_AWS_SECRET_ACCESS_KEY }}
aws-region: eu-central-1
- name: "Upload documentation to AWS S3"
if: >
github.ref == 'refs/heads/master' && github.repository == 'apache/airflow' &&
github.event_name == 'push'
run: aws s3 sync ./files/documentation s3://apache-airflow-docs
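The ``aws s3 sync`` step above is what actually publishes the built documentation. As a rough illustration, a Python equivalent of that one-liner could look like the following (a sketch assuming ``boto3`` and valid AWS credentials; the CI job itself uses the AWS CLI):

.. code-block:: python

    import os

    import boto3

    s3 = boto3.client("s3")

    # Mirror `aws s3 sync ./files/documentation s3://apache-airflow-docs`:
    # walk the generated documentation tree and upload each file. Unlike the
    # CLI, this sketch re-uploads everything instead of skipping unchanged files.
    docs_root = "./files/documentation"
    for root, _, files in os.walk(docs_root):
        for name in files:
            path = os.path.join(root, name)
            key = os.path.relpath(path, docs_root)
            s3.upload_file(path, "apache-airflow-docs", key)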

docs-spell-check:
timeout-minutes: 30
2 changes: 2 additions & 0 deletions .gitignore
@@ -88,6 +88,8 @@ instance/
# Sphinx documentation
docs/_build/
docs/_api/
docs/*/_api/
docs/_doctrees

# PyBuilder
target/
1 change: 1 addition & 0 deletions .pre-commit-config.yaml
@@ -457,6 +457,7 @@ repos:
entry: ./scripts/ci/pre_commit/pre_commit_check_provider_yaml_files.py
language: python
require_serial: true
files: provider.yaml$
additional_dependencies: ['PyYAML==5.3.1', 'jsonschema==3.2.0', 'tabulate==0.8.7']
- id: mermaid
name: Generate mermaid images
15 changes: 14 additions & 1 deletion CI.rst
@@ -695,9 +695,22 @@ We also have a script that can help to clean-up the old artifacts:
CodeQL scan
-----------

The CodeQL security scan uses GitHub security scan framework to scan our code for security violations.
The `CodeQL <https://securitylab.github.com/tools/codeql>`_ security scan uses the GitHub security scan framework to scan our code for security violations.
It is run for JavaScript and Python code.

Publishing documentation
------------------------

Documentation from the ``master`` branch is automatically published on Amazon S3.

To make this possible, GitHub Actions has secrets set up with credentials
for an Amazon Web Services account - ``DOCS_AWS_ACCESS_KEY_ID`` and ``DOCS_AWS_SECRET_ACCESS_KEY``.

This account has permission to write/list/put objects in the ``apache-airflow-docs`` bucket. This bucket has public access configured, which means it is accessible through its website endpoint. For more information, see: `Hosting a static website on Amazon S3
<https://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html>`_

Website endpoint: http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/
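A quick way to confirm that a publish succeeded is to fetch that endpoint, for example (a sketch using the ``requests`` package; not part of the CI pipeline):

.. code-block:: python

    import requests

    # Any 2xx response means the bucket is serving the published documentation.
    resp = requests.get("http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/")
    resp.raise_for_status()
    print(f"Docs are up, index page is {len(resp.text)} bytes")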

Naming conventions for stored images
====================================

44 changes: 3 additions & 41 deletions CONTRIBUTING.rst
@@ -250,7 +250,7 @@ Step 4: Prepare PR

For example, to address this example issue, do the following:

* Read about `email configuration in Airflow <https://airflow.readthedocs.io/en/latest/howto/email-config.html>`__.
* Read about `email configuration in Airflow </docs/howto/email-config.rst>`__.

* Find the class you should modify. For the example GitHub issue,
this is `email.py <https://github.com/apache/airflow/blob/master/airflow/utils/email.py>`__.
@@ -713,47 +713,9 @@ jobs for each python version.
Documentation
=============

The latest API documentation (for the master branch) is usually available
`here <https://airflow.readthedocs.io/en/latest/>`__.
Documentation for the ``apache-airflow`` package and other packages that are closely related to it, i.e. provider packages, is in the ``/docs/`` directory. For detailed information on documentation development, see: `docs/README.md <docs/README.md>`_

To generate a local version you can use `<BREEZE.rst>`_.

The documentation build consists of verifying consistency of documentation and two steps:

* spell checking
* building documentation

You can only run one of the steps via ``--spellcheck-only`` or ``--docs-only``.

.. code-block:: bash

    ./breeze build-docs

or just to run spell-check

.. code-block:: bash

    ./breeze build-docs -- --spellcheck-only

or just to run documentation building

.. code-block:: bash

    ./breeze build-docs -- --docs-only

Documentation is also available as a downloadable artifact in GitHub Actions after the CI builds your PR.

**Known issues:**

If you are creating a new directory for new integration in the ``airflow.providers`` package,
you should also update the ``docs/autoapi_templates/index.rst`` file.

If you are creating new ``hooks``, ``sensors``, ``operators`` directory in
the ``airflow.providers`` package, you should also update
the ``docs/operators-and-hooks-ref.rst`` file.

If you are creating ``example_dags`` directory, you need to create ``example_dags/__init__.py`` with Apache
license or copy another ``__init__.py`` file that contains the necessary license.
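For reference, the license header in question is the standard ASF header used at the top of Python files throughout the repository:

.. code-block:: python

    # Licensed to the Apache Software Foundation (ASF) under one
    # or more contributor license agreements.  See the NOTICE file
    # distributed with this work for additional information
    # regarding copyright ownership.  The ASF licenses this file
    # to you under the Apache License, Version 2.0 (the
    # "License"); you may not use this file except in compliance
    # with the License.  You may obtain a copy of the License at
    #
    #   http://www.apache.org/licenses/LICENSE-2.0
    #
    # Unless required by applicable law or agreed to in writing,
    # software distributed under the License is distributed on an
    # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
    # either express or implied.  See the License for the
    # specific language governing permissions and limitations
    # under the License.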
For Helm Chart documentation, see: `/chart/README.md <../chart/README.md>`__

Static code checks
==================
3 changes: 1 addition & 2 deletions README.md
@@ -22,7 +22,6 @@
[![PyPI version](https://badge.fury.io/py/apache-airflow.svg)](https://badge.fury.io/py/apache-airflow)
[![GitHub Build](https://github.com/apache/airflow/workflows/CI%20Build/badge.svg)](https://github.com/apache/airflow/actions)
[![Coverage Status](https://img.shields.io/codecov/c/github/apache/airflow/master.svg)](https://codecov.io/github/apache/airflow?branch=master)
[![Documentation Status](https://readthedocs.org/projects/airflow/badge/?version=latest)](https://airflow.readthedocs.io/en/latest/?badge=latest)
[![License](http://img.shields.io/:license-Apache%202-blue.svg)](http://www.apache.org/licenses/LICENSE-2.0.txt)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/apache-airflow.svg)](https://pypi.org/project/apache-airflow/)
[![Docker Pulls](https://img.shields.io/docker/pulls/apache/airflow.svg)](https://hub.docker.com/r/apache/airflow)
@@ -135,7 +134,7 @@ pip install apache-airflow[postgres,google]==1.10.12 \
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-1.10.12/constraints-3.7.txt"
```

For information on installing backport providers check https://airflow.readthedocs.io/en/latest/backport-providers.html.
For information on installing backport providers check [/docs/backport-providers.rst](/docs/backport-providers.rst).

## Official source code

4 changes: 2 additions & 2 deletions airflow/api_connexion/openapi/v1.yaml
@@ -171,7 +171,7 @@ info:
The default is to deny all requests.
For details on configuring the authentication, see
[API Authorization](https://airflow.readthedocs.io/en/latest/security/api.html).
[API Authorization](https://airflow.apache.org/docs/stable/security/api.html).
# Errors
@@ -1880,7 +1880,7 @@ components:
DAG details.
For details see:
(airflow.models.DAG)[https://airflow.readthedocs.io/en/stable/_api/airflow/models/index.html#airflow.models.DAG]
[airflow.models.DAG](https://airflow.apache.org/docs/stable/_api/airflow/models/index.html#airflow.models.DAG)
allOf:
- $ref: '#/components/schemas/DAG'
- type: object
5 changes: 5 additions & 0 deletions airflow/provider.yaml.schema.json
@@ -6,6 +6,10 @@
"description": "Package name available under which the package is available in the PyPI repository.",
"type": "string"
},
"name": {
"description": "Provider name",
"type": "string"
},
"description": {
"description": "Information about the package in RST format",
"type": "string"
@@ -167,6 +171,7 @@
},
"additionalProperties": false,
"required": [
"name",
"package-name",
"description",
"versions"
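The pre-commit hook registered above (``pre_commit_check_provider_yaml_files.py``) validates every ``provider.yaml`` against this schema. A minimal standalone check could look like this (a sketch using the ``PyYAML`` and ``jsonschema`` packages pinned in ``.pre-commit-config.yaml``; paths assumed from the repository layout):

.. code-block:: python

    import json

    import jsonschema
    import yaml

    with open("airflow/provider.yaml.schema.json") as schema_file:
        schema = json.load(schema_file)

    with open("airflow/providers/amazon/provider.yaml") as provider_file:
        provider = yaml.safe_load(provider_file)

    # Raises jsonschema.ValidationError if a required field such as
    # the newly added "name" is missing or has the wrong type.
    jsonschema.validate(instance=provider, schema=schema)
    print("provider.yaml is valid")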
2 changes: 1 addition & 1 deletion airflow/providers/amazon/aws/hooks/base_aws.py
@@ -21,7 +21,7 @@
.. seealso::
For more information on how to use this hook, take a look at the guide:
:ref:`howto/connection:AWSHook`
:ref:`apache-airflow:howto/connection:AWSHook`
"""

import configparser
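The new ``apache-airflow:`` prefix on these ``:ref:`` targets is an intersphinx-style cross-project reference: with documentation now built per provider, each package is a separate Sphinx project, so links into the core docs must name the inventory they live in. A sketch of the kind of ``conf.py`` wiring this relies on (the mapping URL is an assumption, not the repository's actual configuration):

.. code-block:: python

    # Sphinx conf.py (sketch): register the "apache-airflow" inventory so that
    # :ref:`apache-airflow:howto/connection:AWSHook` resolves from provider docs.
    extensions = ["sphinx.ext.intersphinx"]

    intersphinx_mapping = {
        "apache-airflow": ("https://airflow.apache.org/docs/stable/", None),
    }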
2 changes: 1 addition & 1 deletion airflow/providers/amazon/aws/operators/datasync.py
@@ -36,7 +36,7 @@ class AWSDataSyncOperator(BaseOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:AWSDataSyncOperator`
:ref:`apache-airflow:howto/operator:AWSDataSyncOperator`
.. note:: There may be 0, 1, or many existing DataSync Tasks defined in your AWS
environment. The default behavior is to create a new Task if there are 0, or
2 changes: 1 addition & 1 deletion airflow/providers/amazon/aws/operators/ecs.py
@@ -76,7 +76,7 @@ class ECSOperator(BaseOperator):  # pylint: disable=too-many-instance-attributes
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:ECSOperator`
:ref:`apache-airflow:howto/operator:ECSOperator`
:param task_definition: the task definition name on Elastic Container Service
:type task_definition: str
2 changes: 1 addition & 1 deletion airflow/providers/amazon/aws/operators/glacier.py
@@ -26,7 +26,7 @@ class GlacierCreateJobOperator(BaseOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:GlacierCreateJobOperator`
:ref:`apache-airflow:howto/operator:GlacierCreateJobOperator`
:param aws_conn_id: The reference to the AWS connection details
:type aws_conn_id: str
4 changes: 4 additions & 0 deletions airflow/providers/amazon/aws/sensors/glacier.py
@@ -35,6 +35,10 @@ class GlacierJobOperationSensor(BaseSensorOperator):
"""
Glacier sensor for checking job state. This operator runs only in reschedule mode.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`apache-airflow:howto/operator:GlacierJobOperationSensor`
:param aws_conn_id: The reference to the AWS connection details
:type aws_conn_id: str
:param vault_name: name of Glacier vault on which job is executed
2 changes: 1 addition & 1 deletion airflow/providers/amazon/aws/transfers/glacier_to_gcs.py
@@ -34,7 +34,7 @@ class GlacierToGCSOperator(BaseOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:GlacierToGCSOperator`
:ref:`apache-airflow:howto/operator:GlacierToGCSOperator`
:param aws_conn_id: The reference to the AWS connection details
:type aws_conn_id: str
airflow/providers/amazon/aws/transfers/imap_attachment_to_s3.py
@@ -28,7 +28,7 @@ class ImapAttachmentToS3Operator(BaseOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:ImapAttachmentToS3Operator`
:ref:`apache-airflow:howto/operator:ImapAttachmentToS3Operator`
:param imap_attachment_name: The file name of the mail attachment that you want to transfer.
:type imap_attachment_name: str
2 changes: 1 addition & 1 deletion airflow/providers/amazon/aws/transfers/s3_to_redshift.py
@@ -29,7 +29,7 @@ class S3ToRedshiftOperator(BaseOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:S3ToRedshiftOperator`
:ref:`apache-airflow:howto/operator:S3ToRedshiftOperator`
:param schema: reference to a specific schema in redshift database
:type schema: str
1 change: 1 addition & 0 deletions airflow/providers/amazon/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-amazon
name: Amazon
description: |
Amazon integration (including `Amazon Web Services (AWS) <https://aws.amazon.com/>`__).
1 change: 1 addition & 0 deletions airflow/providers/apache/cassandra/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-cassandra
name: Apache Cassandra
description: |
`Apache Cassandra <http://cassandra.apache.org/>`__.
2 changes: 1 addition & 1 deletion airflow/providers/apache/cassandra/sensors/record.py
@@ -33,7 +33,7 @@ class CassandraRecordSensor(BaseSensorOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CassandraRecordSensor`
:ref:`apache-airflow:howto/operator:CassandraRecordSensor`
For example, if you want to wait for a record that has values 'v1' and 'v2' for each
primary keys 'p1' and 'p2' to be populated in keyspace 'k' and table 't',
2 changes: 1 addition & 1 deletion airflow/providers/apache/cassandra/sensors/table.py
@@ -34,7 +34,7 @@ class CassandraTableSensor(BaseSensorOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CassandraTableSensor`
:ref:`apache-airflow:howto/operator:CassandraTableSensor`
For example, if you want to wait for a table called 't' to be created
1 change: 1 addition & 0 deletions airflow/providers/apache/druid/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-druid
name: Apache Druid
description: |
`Apache Druid <https://druid.apache.org/>`__.
1 change: 1 addition & 0 deletions airflow/providers/apache/hdfs/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-hdfs
name: Apache HDFS
description: |
`Hadoop Distributed File System (HDFS) <https://hadoop.apache.org/docs/r1.2.1/hdfs_design.html>`__
and `WebHDFS <https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html>`__.
1 change: 1 addition & 0 deletions airflow/providers/apache/hive/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-hive
name: Apache Hive
description: |
`Apache Hive <https://hive.apache.org/>`__
1 change: 1 addition & 0 deletions airflow/providers/apache/kylin/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-kylin
name: Apache Kylin
description: |
`Apache Kylin <https://kylin.apache.org/>`__
1 change: 1 addition & 0 deletions airflow/providers/apache/livy/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-livy
name: Apache Livy
description: |
`Apache Livy <https://livy.apache.org/>`__
1 change: 1 addition & 0 deletions airflow/providers/apache/pig/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-pig
name: Apache Pig
description: |
`Apache Pig <https://pig.apache.org/>`__
1 change: 1 addition & 0 deletions airflow/providers/apache/pinot/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-pinot
name: Apache Pinot
description: |
`Apache Pinot <https://pinot.apache.org/>`__
2 changes: 1 addition & 1 deletion airflow/providers/apache/spark/operators/spark_jdbc.py
@@ -33,7 +33,7 @@ class SparkJDBCOperator(SparkSubmitOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:SparkJDBCOperator`
:ref:`apache-airflow:howto/operator:SparkJDBCOperator`
:param spark_app_name: Name of the job (default airflow-spark-jdbc)
:type spark_app_name: str
2 changes: 1 addition & 1 deletion airflow/providers/apache/spark/operators/spark_sql.py
@@ -29,7 +29,7 @@ class SparkSqlOperator(BaseOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:SparkSqlOperator`
:ref:`apache-airflow:howto/operator:SparkSqlOperator`
:param sql: The SQL query to execute. (templated)
:type sql: str
2 changes: 1 addition & 1 deletion airflow/providers/apache/spark/operators/spark_submit.py
@@ -33,7 +33,7 @@ class SparkSubmitOperator(BaseOperator):
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:SparkSubmitOperator`
:ref:`apache-airflow:howto/operator:SparkSubmitOperator`
:param application: The application that submitted as a job, either jar or py file. (templated)
:type application: str
1 change: 1 addition & 0 deletions airflow/providers/apache/spark/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-spark
name: Apache Spark
description: |
`Apache Spark <https://spark.apache.org/>`__
1 change: 1 addition & 0 deletions airflow/providers/apache/sqoop/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-apache-sqoop
name: Apache Sqoop
description: |
`Apache Sqoop <https://sqoop.apache.org/>`__
1 change: 1 addition & 0 deletions airflow/providers/celery/provider.yaml
@@ -17,6 +17,7 @@

---
package-name: apache-airflow-providers-celery
name: Celery
description: |
`Celery <http://www.celeryproject.org/>`__