Commit c5dfb11

Use explicit directives instead of implicit syntax (#50870)

1 parent 3e4104b commit c5dfb11

8 files changed: +23 additions, -18 deletions
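The common theme across all eight files: Sphinx resolves the implicit ``\`text <target>\`_`` form as an external hyperlink, so a mistyped target fails silently, while explicit roles such as ``:ref:`` and ``:doc:`` are resolved against known labels and documents and produce a build warning when broken. A minimal sketch of the pattern, using targets taken from this commit:

```rst
.. Implicit syntax: Sphinx treats the target as a plain hyperlink URI,
   so a wrong or misspelled target is not reported at build time.

`best practices on Airflow Variables <best_practice:airflow_variables>`_

.. Explicit roles: checked at build time, so a bad target produces a
   warning (an error when building with ``-W``).

:ref:`best practices on Airflow Variables <best_practices/airflow_variables>`
:doc:`Stable REST API <stable-rest-api-ref>`
```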

airflow-core/docs/best-practices.rst

Lines changed: 1 addition & 2 deletions

@@ -296,8 +296,6 @@ When you execute that code you will see:
 
 This means that the ``get_array`` is not executed as top-level code, but ``get_task_id`` is.
 
-.. _best_practices/dynamic_dag_generation:
-
 Code Quality and Linting
 ------------------------
 
@@ -351,6 +349,7 @@ By integrating ``ruff`` into your development workflow, you can proactively addr
 
 For more information on ``ruff`` and its integration with Airflow, refer to the `official Airflow documentation <https://airflow.apache.org/docs/apache-airflow/stable/best-practices.html>`_.
 
+.. _best_practices/dynamic_dag_generation:
 
 Dynamic DAG Generation
 ----------------------

airflow-core/docs/howto/docker-compose/index.rst

Lines changed: 7 additions & 5 deletions

@@ -307,11 +307,13 @@ Examples of how you can extend the image with custom providers, python packages,
 apt packages and more can be found in :doc:`Building the image <docker-stack:build>`.
 
 .. note::
-    Creating custom images means that you need to maintain also a level of automation as you need to re-create the images
-    when either the packages you want to install or Airflow is upgraded. Please do not forget about keeping these scripts.
-    Also keep in mind, that in cases when you run pure Python tasks, you can use the
-    `Python Virtualenv functions <_howto/operator:PythonVirtualenvOperator>`_ which will
-    dynamically source and install python dependencies during runtime. With Airflow 2.8.0 Virtualenvs can also be cached.
+    Creating custom images means that you need to maintain also a level of
+    automation as you need to re-create the images when either the packages you
+    want to install or Airflow is upgraded. Please do not forget about keeping
+    these scripts. Also keep in mind, that in cases when you run pure Python
+    tasks, you can use :ref:`Python Virtualenv functions <howto/operator:PythonVirtualenvOperator>`,
+    which will dynamically source and install python dependencies during runtime.
+    With Airflow 2.8.0, virtualenvs can also be cached.
 
 Special case - adding dependencies via requirements.txt file
 ============================================================

airflow-core/docs/howto/dynamic-dag-generation.rst

Lines changed: 2 additions & 1 deletion

@@ -40,7 +40,8 @@ If you want to use variables to configure your code, you should always use
 `environment variables <https://wiki.archlinux.org/title/environment_variables>`_ in your
 top-level code rather than :doc:`Airflow Variables </core-concepts/variables>`. Using Airflow Variables
 in top-level code creates a connection to the metadata DB of Airflow to fetch the value, which can slow
-down parsing and place extra load on the DB. See the `best practices on Airflow Variables <best_practice:airflow_variables>`_
+down parsing and place extra load on the DB. See
+:ref:`best practices on Airflow Variables <best_practices/airflow_variables>`
 to make the best use of Airflow Variables in your dags using Jinja templates.
 
 For example you could set ``DEPLOYMENT`` variable differently for your production and development

airflow-core/docs/installation/upgrading_to_airflow3.rst

Lines changed: 1 addition & 1 deletion

@@ -71,7 +71,7 @@ Some changes can be automatically fixed. To do so, run the following command:
     ruff check dag/ --select AIR301 --fix --preview
 
 
-You can also configure these flags through configuration files. See `Configuring Ruff <Configuring Ruff>`_ for details.
+You can also configure these flags through configuration files. See `Configuring Ruff <https://docs.astral.sh/ruff/configuration/>`_ for details.
 
 Step 4: Install the Standard Providers
 --------------------------------------
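The configuration-file equivalent of the command-line flags in the hunk above can be sketched roughly as follows (a hypothetical ``pyproject.toml`` fragment; consult the linked Ruff configuration docs for the authoritative keys):

```toml
# pyproject.toml -- sketch of the same flags expressed as configuration
[tool.ruff]
fix = true       # same as --fix
preview = true   # same as --preview

[tool.ruff.lint]
select = ["AIR301"]  # same as --select AIR301
```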

airflow-core/docs/public-airflow-interface.rst

Lines changed: 5 additions & 4 deletions

@@ -46,9 +46,9 @@ MAJOR version of Airflow. On the other hand, classes and methods starting with `
 as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
 Airflow Interface and might change at any time.
 
-You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+You can also use Airflow's Public Interface via the :doc:`Stable REST API <stable-rest-api-ref>` (based on the
 OpenAPI specification). For specific needs you can also use the
-`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref>`_ though its behaviour might change
+:doc:`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref>` though its behaviour might change
 in details (such as output format and available flags) so if you want to rely on those in programmatic
 way, the Stable REST API is recommended.
 
@@ -407,11 +407,12 @@ Everything not mentioned in this document should be considered as non-Public Int
 Sometimes in other applications those components could be relied on to keep backwards compatibility,
 but in Airflow they are not parts of the Public Interface and might change any time:
 
-* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+* :doc:`Database structure <database-erd-ref>` is considered to be an internal implementation
   detail and you should not assume the structure is going to be maintained in a
   backwards-compatible way.
 
-* `Web UI <ui>`_ is continuously evolving and there are no backwards compatibility guarantees on HTML elements.
+* :doc:`Web UI <ui>` is continuously evolving and there are no backwards
+  compatibility guarantees on HTML elements.
 
 * Python classes except those explicitly mentioned in this document, are considered an
   internal implementation detail and you should not assume they will be maintained

chart/docs/index.rst

Lines changed: 2 additions & 0 deletions

@@ -81,6 +81,8 @@ Features
 * Kerberos secure configuration
 * One-command deployment for any type of executor. You don't need to provide other services e.g. Redis/Database to test the Airflow.
 
+.. _helm_chart_install:
+
 Installing the Chart
 --------------------
 
chart/docs/installing-helm-chart-from-sources.rst

Lines changed: 3 additions & 4 deletions

@@ -16,17 +16,16 @@
     under the License.
 
 Installing Helm Chart from sources
-----------------------------------
+==================================
 
 Released packages
 '''''''''''''''''
 
 .. jinja:: official_download_page
 
     This page describes downloading and verifying ``Apache Airflow Official Helm Chart`` version
-    ``{{ package_version}}`` using officially released source packages. You can also install the chart
-    directly from the ``airflow.apache.org`` repo as described in
-    `Installing the chart <index#installing-the-chart>`_.
+    ``{{ package_version }}`` using officially released source packages. You can also install the chart
+    directly from the ``airflow.apache.org`` repo as described in :ref:`helm_chart_install`.
     You can choose different version of the chart by selecting different version from the drop-down at
     the top-left of the page.
 
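The change above pairs with the ``.. _helm_chart_install:`` label added in ``chart/docs/index.rst``: a ``:ref:`` works across files because it resolves a project-wide unique label rather than a relative document path. The pattern, in sketch form:

```rst
.. In chart/docs/index.rst -- define a label just above the section:

.. _helm_chart_install:

Installing the Chart
--------------------

.. In any other document -- reference it by label; Sphinx uses the
   section title as the link text unless you supply your own:

See :ref:`helm_chart_install` to install the chart from the repo.
```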

providers/amazon/docs/operators/athena/index.rst

Lines changed: 2 additions & 1 deletion

@@ -38,7 +38,8 @@ Airflow offers two ways to query data using Amazon Athena.
 **Amazon Athena SQL (DB API Connection):** Opt for this if you need to execute multiple queries in the same operator and it's essential to retrieve and process query results directly in Airflow, such as for sensing values or further data manipulation.
 
 .. note::
-    Both connection methods uses `Amazon Web Services Connection <../../connections/aws>`_ under the hood for authentication.
+    Both connection methods uses :doc:`Amazon Web Services Connection <../../connections/aws>`
+    under the hood for authentication.
     You should decide which connection method to use based on your use case.
 
 .. toctree::
