10 changes: 9 additions & 1 deletion RELEASE_NOTES.rst
@@ -24,7 +24,7 @@

.. towncrier release notes start

-Airflow 3.0.3 (2025-07-08)
+Airflow 3.0.3 (2025-07-14)
--------------------------

Significant Changes
@@ -80,6 +80,13 @@ Bug Fixes
- Fixing bad cadwyn migration for upstream map indexes (#52797)
- Run trigger expansion logic only when ``start_from_trigger`` is True (#52873)
- Fix example dag ``example_external_task_parent_deferrable.py`` imports (#52957)
+- Fixes pagination in DAG run lists (#52989)
+- Fix db downgrade check condition (#53005)
+- Fix log viewing for skipped task (#53028, #53101)
+- Fixes Grid view refresh after user actions (#53086)
+- Fix ``no_status`` and ``duration`` for grid summaries (#53092)
+- Fix ``ti.log_url`` not in Task Context (#50376)
+- Fix XCom data deserialization when using ``XCom.get_all()`` method (#53102)

Miscellaneous
"""""""""""""
@@ -99,6 +106,7 @@ Doc Only Changes
- Add http-only warning when running behind proxy in documentation (#52699)
- Publish separate docs for Task SDK (#52682)
- Streamline Taskflow examples and link to core tutorial (#52709)
+- Refresh Public Interface & align how-to guides for Airflow 3.0+ (#53011)

Airflow 3.0.2 (2025-06-10)
--------------------------
10 changes: 0 additions & 10 deletions airflow-core/docs/conf.py
@@ -120,7 +120,6 @@

PACKAGES_THAT_WE_SHOULD_ADD_TO_API_DOCS = {
"hooks",
-"decorators",
"example_dags",
"executors",
"operators",
@@ -140,15 +139,6 @@

MODELS_THAT_SHOULD_BE_INCLUDED_IN_API_DOCS: set[str] = {
"baseoperator.py",
-"connection.py",
-"dag.py",
-"dagrun.py",
-"dagbag.py",
-"param.py",
-"taskinstance.py",
-"taskinstancekey.py",
-"variable.py",
-"xcom.py",
}


2 changes: 1 addition & 1 deletion airflow-core/docs/core-concepts/params.rst
@@ -32,7 +32,7 @@ If the user-supplied values don't pass validation, Airflow shows a warning inste
DAG-level Params
----------------

-To add Params to a :class:`~airflow.models.dag.DAG`, initialize it with the ``params`` kwarg.
+To add Params to a :class:`~airflow.sdk.DAG`, initialize it with the ``params`` kwarg.
Use a dictionary that maps Param names to either a :class:`~airflow.sdk.definitions.param.Param` or an object indicating the parameter's default value.

.. code-block::
14 changes: 14 additions & 0 deletions airflow-core/docs/core-concepts/variables.rst
@@ -33,6 +33,20 @@ To use them, just import and call ``get`` on the Variable model::
# Returns the value of default (None) if the variable is not set
baz = Variable.get("baz", default=None)

+You can also access variables through the Task Context using
+:func:`~airflow.sdk.get_current_context`:
+
+.. code-block:: python
+
+    from airflow.sdk import get_current_context
+
+
+    def my_task():
+        context = get_current_context()
+        var = context["var"]
+        my_variable = var.get("my_variable_name")
+        return my_variable
+
You can also use them from :ref:`templates <concepts:jinja-templating>`::

# Raw value
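The ``context["var"]`` accessor added in this hunk behaves like a small mapping with defaults. A minimal pure-Python sketch of that access pattern (no Airflow required; ``MockVarAccessor`` and the variable names are invented for illustration and are not the real SDK classes):

```python
class MockVarAccessor:
    """Toy stand-in for the context["var"] accessor shown above."""

    def __init__(self, store):
        self._store = store  # dict standing in for Airflow's Variable storage

    def get(self, key, default=None):
        # Mirrors Variable.get: return the stored value, else the default
        return self._store.get(key, default)


context = {"var": MockVarAccessor({"my_variable_name": "my_value"})}

var = context["var"]
print(var.get("my_variable_name"))  # -> my_value
print(var.get("baz", default=None))  # -> None
```

The point of the accessor is that task code never touches the metadata database directly; it reads variables through the context handed to the running task.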
9 changes: 5 additions & 4 deletions airflow-core/docs/core-concepts/xcoms.rst
@@ -25,6 +25,9 @@ XComs (short for "cross-communications") are a mechanism that let :doc:`tasks` t

An XCom is identified by a ``key`` (essentially its name), as well as the ``task_id`` and ``dag_id`` it came from. They can have any serializable value (including objects that are decorated with ``@dataclass`` or ``@attr.define``, see :ref:`TaskFlow arguments <concepts:arbitrary-arguments>`:), but they are only designed for small amounts of data; do not use them to pass around large values, like dataframes.

+XCom operations should be performed through the Task Context using
+:func:`~airflow.sdk.get_current_context`. Directly updating the ``XCom`` database model is not supported.
+
XComs are explicitly "pushed" and "pulled" to/from their storage using the ``xcom_push`` and ``xcom_pull`` methods on Task Instances.

To push a value within a task called **"task-1"** that will be used by another task:
@@ -73,8 +76,6 @@ An example of pushing multiple XComs and pulling them individually:
    # Pulling entire xcom data from push_multiple task
    data = context["ti"].xcom_pull(task_ids="push_multiple", key="return_value")

.. note::

    If the first task run does not succeed, the task's XComs are cleared on every retry to make the task run idempotent.
@@ -91,7 +92,7 @@ Custom XCom Backends

The XCom system has interchangeable backends, and you can set which backend is being used via the ``xcom_backend`` configuration option.

-If you want to implement your own backend, you should subclass :class:`~airflow.models.xcom.BaseXCom`, and override the ``serialize_value`` and ``deserialize_value`` methods.
+If you want to implement your own backend, you should subclass :class:`~airflow.sdk.bases.xcom.BaseXCom`, and override the ``serialize_value`` and ``deserialize_value`` methods.

You can override the ``purge`` method in the ``BaseXCom`` class to have control over purging the xcom data from the custom backend. This will be called as part of ``delete``.
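As a rough illustration of the ``serialize_value``/``deserialize_value``/``purge`` contract described above, here is a pure-Python mock (not the real ``BaseXCom`` API; the class shape, method signatures, and the dict-backed store are simplified assumptions):

```python
import json


class MockXComBackend:
    """Toy stand-in for a custom XCom backend: values are JSON-serialized
    on push and deserialized on pull, mirroring the override points the
    docs name (serialize_value / deserialize_value / purge)."""

    _store: dict = {}  # stands in for the backend's real storage

    @staticmethod
    def serialize_value(value):
        return json.dumps(value)

    @staticmethod
    def deserialize_value(raw):
        return json.loads(raw)

    @classmethod
    def push(cls, key, value):
        cls._store[key] = cls.serialize_value(value)

    @classmethod
    def pull(cls, key):
        return cls.deserialize_value(cls._store[key])

    @classmethod
    def purge(cls, key):
        # In the real contract, purge is called as part of delete
        cls._store.pop(key, None)


MockXComBackend.push("stats", {"rows": 42})
print(MockXComBackend.pull("stats"))  # -> {'rows': 42}
```

A real backend would typically serialize to external storage (e.g. object storage) and keep only a reference in the database, which is why the serialize/deserialize pair is the extension point.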

@@ -104,6 +105,6 @@ If you can exec into a terminal in an Airflow container, you can then print out

.. code-block:: python

-    from airflow.models.xcom import XCom
+    from airflow.sdk.execution_time.xcom import XCom

     print(XCom.__name__)
6 changes: 3 additions & 3 deletions airflow-core/docs/howto/connection.rst
@@ -22,7 +22,7 @@ Managing Connections

For an overview of hooks and connections, see :doc:`/authoring-and-scheduling/connections`.

-Airflow's :class:`~airflow.models.connection.Connection` object is used for storing credentials and other information necessary for connecting to external services.
+Airflow's :class:`~airflow.sdk.Connection` object is used for storing credentials and other information necessary for connecting to external services.

Connections may be defined in the following ways:

@@ -77,7 +77,7 @@ convenience property :py:meth:`~airflow.models.connection.Connection.as_json`. I

.. code-block:: pycon

-    >>> from airflow.models.connection import Connection
+    >>> from airflow.sdk import Connection
     >>> c = Connection(
     ...     conn_id="some_conn",
     ...     conn_type="mysql",
@@ -94,7 +94,7 @@ In addition, same approach could be used to convert Connection from URI format t

.. code-block:: pycon

-    >>> from airflow.models.connection import Connection
+    >>> from airflow.sdk import Connection
     >>> c = Connection(
     ...     conn_id="awesome_conn",
     ...     description="Example Connection",
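The URI round-trip these hunks demonstrate follows the ``conn-type://login:password@host:port/schema`` shape. A hedged pure-Python sketch of composing such a URI (the ``build_conn_uri`` helper is invented for illustration and is not the ``Connection.get_uri`` implementation):

```python
from urllib.parse import quote


def build_conn_uri(conn_type, login, password, host, port, schema):
    # Assembles an Airflow-style connection URI; credentials must be
    # percent-encoded so characters like '@' don't break the URI structure
    return (
        f"{conn_type}://{quote(login, safe='')}:{quote(password, safe='')}"
        f"@{host}:{port}/{schema}"
    )


uri = build_conn_uri("mysql", "some_user", "p@ss word", "host", 3306, "testschema")
print(uri)  # -> mysql://some_user:p%40ss%20word@host:3306/testschema
```

Percent-encoding the password is the detail that most often trips people up when hand-writing connection URIs for environment variables.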
2 changes: 1 addition & 1 deletion airflow-core/docs/howto/custom-operator.rst
@@ -24,7 +24,7 @@ Creating a custom Operator
Airflow allows you to create new operators to suit the requirements of you or your team.
This extensibility is one of the many features which make Apache Airflow powerful.

-You can create any operator you want by extending the :class:`airflow.models.baseoperator.BaseOperator`
+You can create any operator you want by extending the public SDK base class :class:`~airflow.sdk.BaseOperator`.

There are two methods that you need to override in a derived class:

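The two override points in a derived operator are the constructor and ``execute``. A minimal pure-Python sketch of that pattern (``MockBaseOperator`` is a stand-in so the example runs without Airflow; it is not the real ``airflow.sdk.BaseOperator``):

```python
class MockBaseOperator:
    """Toy stand-in for the SDK BaseOperator."""

    def __init__(self, task_id, **kwargs):
        self.task_id = task_id

    def execute(self, context):
        raise NotImplementedError


class HelloOperator(MockBaseOperator):
    # 1. Constructor: accept operator-specific arguments,
    #    pass the rest (task_id, etc.) up to the base class
    def __init__(self, name, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    # 2. execute: the work performed when the task runs;
    #    the return value is pushed as the task's XCom
    def execute(self, context):
        message = f"Hello {self.name}"
        print(message)
        return message


result = HelloOperator(task_id="hello_task", name="foo").execute(context={})
print(result)  # -> Hello foo
```

With the real base class, instantiating ``HelloOperator`` inside a DAG definition is all that is needed; the scheduler calls ``execute`` with the task context at run time.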
2 changes: 1 addition & 1 deletion airflow-core/docs/howto/docker-compose/docker-compose.yaml
@@ -217,7 +217,7 @@ services:
echo "For other operating systems you can get rid of the warning with manually created .env file:"
echo " See: https://airflow.apache.org/docs/apache-airflow/stable/howto/docker-compose/index.html#setting-the-right-airflow-user"
echo
-export AIRFLOW_UID=$(id -u)
+export AIRFLOW_UID=$$(id -u)
fi
one_meg=1048576
mem_available=$$(($$(getconf _PHYS_PAGES) * $$(getconf PAGE_SIZE) / one_meg))
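The ``$$`` fix above matters because Docker Compose interpolates single ``$`` itself; ``$$`` produces a literal ``$`` in the rendered command, so the container's shell (not Compose) expands ``$(id -u)``. Outside Compose, the equivalent plain-shell steps look like this (the ``/tmp/airflow.env`` path is illustrative):

```shell
# What the escaped line ultimately runs inside the container's shell:
export AIRFLOW_UID=$(id -u)

# Persisting the value for Compose via an env file (illustrative path):
echo "AIRFLOW_UID=${AIRFLOW_UID}" > /tmp/airflow.env
cat /tmp/airflow.env
```

Without the second ``$``, Compose would try to interpolate ``$(id`` as one of its own variables before the container shell ever ran.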