Task failing with AioSession.__init__() got an unexpected keyword argument 'conn_id' with xcombackend #62691

@vatsrahul1001

Description

Apache Airflow version

main (development)

If "Other Airflow 3 version" selected, which one?

No response

What happened?

Running any example DAG on main fails during Jinja template rendering with the error below:

[2026-03-02 14:47:47] INFO - DAG bundles loaded: dags-folder, example_dags
[2026-03-02 14:47:47] INFO - Filling up the DagBag from /opt/airflow/airflow-core/src/airflow/example_dags/example_xcom.py
[2026-03-02 14:47:48] ERROR - Exception rendering Jinja template for task 'bash_push', field 'bash_command'. Template: 'echo "bash_push demo"  && echo "Manually set xcom value {{ ti.xcom_push(key="manually_pushed_value", value="manually_pushed_value") }}" && echo "value_by_return"'
TypeError: AioSession.__init__() got an unexpected keyword argument 'conn_id'
File "/opt/airflow/task-sdk/src/airflow/sdk/definitions/_internal/abstractoperator.py", line 325 in _do_render_template_fields

File "/opt/airflow/task-sdk/src/airflow/sdk/definitions/_internal/templater.py", line 185 in render_template

File "/opt/airflow/task-sdk/src/airflow/sdk/definitions/_internal/abstractoperator.py", line 286 in _render

File "/opt/airflow/task-sdk/src/airflow/sdk/definitions/_internal/templater.py", line 141 in _render

File "/opt/airflow/task-sdk/src/airflow/sdk/definitions/context.py", line 207 in render_template_to_string

File "/opt/airflow/task-sdk/src/airflow/sdk/definitions/context.py", line 197 in render_template

File "<template>", line 13 in root

File "/usr/python/lib/python3.10/site-packages/jinja2/sandbox.py", line 401 in call

File "/usr/python/lib/python3.10/site-packages/jinja2/runtime.py", line 303 in call

File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/task_runner.py", line 434 in xcom_push

File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/task_runner.py", line 676 in _xcom_push

File "/opt/airflow/task-sdk/src/airflow/sdk/bases/xcom.py", line 77 in set

File "/opt/airflow/providers/common/io/src/airflow/providers/common/io/xcom/backend.py", line 145 in serialize_value

File "/usr/python/lib/python3.10/site-packages/upath/extensions.py", line 245 in exists

File "/usr/python/lib/python3.10/site-packages/upath/core.py", line 1552 in exists

File "/usr/python/lib/python3.10/site-packages/fsspec/asyn.py", line 118 in wrapper

File "/usr/python/lib/python3.10/site-packages/fsspec/asyn.py", line 103 in sync

File "/usr/python/lib/python3.10/site-packages/fsspec/asyn.py", line 56 in _runner

File "/usr/python/lib/python3.10/site-packages/s3fs/core.py", line 1118 in _exists

File "/usr/python/lib/python3.10/site-packages/s3fs/core.py", line 1471 in _info

File "/usr/python/lib/python3.10/site-packages/s3fs/core.py", line 377 in _call_s3

File "/usr/python/lib/python3.10/site-packages/s3fs/core.py", line 560 in set_session
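The traceback suggests that the `conn_id` storage option attached to the XCom objectstorage path is forwarded unchanged through fsspec/s3fs down to aiobotocore's session constructor, which does not accept it. A minimal sketch of that failure mode (the class and function names here are illustrative stand-ins, not the real s3fs/aiobotocore code):

```python
class AioSession:
    """Stand-in for aiobotocore's session: accepts only known kwargs."""

    def __init__(self, profile=None):
        self.profile = profile


def set_session(**storage_options):
    # fsspec/s3fs-style layer that forwards caller kwargs unchanged
    return AioSession(**storage_options)


try:
    # ``conn_id`` leaks through from the Airflow storage options
    set_session(conn_id="aws_default")
except TypeError as exc:
    print(exc)  # AioSession.__init__() got an unexpected keyword argument 'conn_id'
```

If this is the mechanism, the fix would be to strip Airflow-specific keys such as `conn_id` from the options before they reach the filesystem layer.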

What you think should happen instead?

No response

How to reproduce

  1. Start breeze on main:
    breeze start-airflow --executor CeleryExecutor --backend postgres --load-example-dags
  2. Set the environment variables below in the breeze init file:
export AIRFLOW__CORE__XCOM_BACKEND=airflow.providers.common.io.xcom.backend.XComObjectStorageBackend
export AIRFLOW__COMMON_IO__XCOM_OBJECTSTORAGE_PATH=s3://aws_default@airflow-tutorial-data-rahul-new
export AIRFLOW__COMMON_IO__XCOM_OBJECTSTORAGE_THRESHOLD=0
  3. Trigger any example DAG.
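Note that the objectstorage path embeds the Airflow connection id (`aws_default`) as the userinfo component of the URL, before the `@`. A stdlib-only sketch of how that component separates from the bucket name (the real ObjectStoragePath parsing may differ):

```python
from urllib.parse import urlsplit

# The connection id rides in the userinfo part of the objectstorage URL
url = urlsplit("s3://aws_default@airflow-tutorial-data-rahul-new")
print(url.username)  # aws_default
print(url.hostname)  # airflow-tutorial-data-rahul-new
```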

Operating System

Os

Versions of Apache Airflow Providers

No response

Deployment

Official Apache Airflow Helm Chart

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
