Description
Apache Airflow version: 1.10.14, apache-airflow-upgrade-check==1.1.0
Kubernetes version (if you are using kubernetes) (use kubectl version):
Environment:
- Cloud provider or hardware configuration: on-premise Linux server
- OS (e.g. from /etc/os-release): RHEL 7
- Kernel (e.g. uname -a):
- Install tools:
- Others:
What happened:
Running airflow upgrade_check gives a series of false positives for the import check rule. For example:
('Using `airflow.contrib.hooks.sftp_hook.SFTPHook` should be replaced by '
'`airflow.providers.sftp.hooks.sftp.SFTPHook`. Affected file: '
'/airflow/dags/std_wsljobs_dags/hccmodel_dag.py')
However, this import has actually already been updated in the indicated file, and when viewing the code using the "Code" button in the Airflow UI it shows the updated path.
What you expected to happen:
That this file would not be flagged by the ImportChangesRule.
How to reproduce it:
Run `airflow upgrade_check` with a DAG file that references one of the classes whose import path changed, even if the import has already been updated to the new provider path.
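The false positive can be demonstrated with a minimal sketch. The paths below mirror the `SFTPHook` example above; the substring logic is extracted from the rule for illustration:

```python
# A DAG file that has ALREADY been migrated to the new provider import:
content = "from airflow.providers.sftp.hooks.sftp import SFTPHook\n"

# The deprecated path the rule is supposed to detect:
old_path = "airflow.contrib.hooks.sftp_hook.SFTPHook"
old_class = old_path.split(".")[-1]  # "SFTPHook"

# The current check matches the bare class name, which also appears in
# the new import line, so the migrated file is still flagged:
print(old_class in content)  # True  -> false positive

# Matching the full dotted path would correctly skip this file:
print(old_path in content)   # False -> no problem reported
```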
Anything else we need to know:
I believe the culprit is this section from the ImportChangesRule:
@staticmethod
def _check_file(file_path):
    problems = []
    providers = set()
    with open(file_path, "r") as file:
        content = file.read()
        for change in ImportChangesRule.ALL_CHANGES:
            if change.old_class in content:
                problems.append(change.info(file_path))
                if change.providers_package:
                    providers.add(change.providers_package)
    return problems, providers
It only checks whether the `old_class` is present, not the `old_path`. The if statement should read: `if change.old_path in content:`. If I make this change in the source file, I receive zero problems back instead of 39.
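A self-contained sketch of the fixed check, with `ImportChangesRule.ALL_CHANGES` and `change.info` replaced by hypothetical stand-ins so it can run outside Airflow:

```python
from collections import namedtuple

# Hypothetical stand-in for the rule's change entries; the real class
# lives in airflow/upgrade/rules/import_changes.py.
ImportChange = namedtuple(
    "ImportChange", ["old_path", "new_path", "providers_package"]
)

ALL_CHANGES = [
    ImportChange(
        old_path="airflow.contrib.hooks.sftp_hook.SFTPHook",
        new_path="airflow.providers.sftp.hooks.sftp.SFTPHook",
        providers_package="apache-airflow-providers-sftp",
    ),
]

def check_content(content):
    problems = []
    providers = set()
    for change in ALL_CHANGES:
        # The proposed fix: match the full dotted old_path instead of
        # the bare class name, so already-migrated imports pass.
        if change.old_path in content:
            problems.append(change.old_path)
            if change.providers_package:
                providers.add(change.providers_package)
    return problems, providers

# A migrated DAG is no longer flagged:
print(check_content("from airflow.providers.sftp.hooks.sftp import SFTPHook\n"))
# -> ([], set())

# A file still spelling out the full deprecated path is caught:
print(check_content("hook = airflow.contrib.hooks.sftp_hook.SFTPHook()\n"))
# -> (['airflow.contrib.hooks.sftp_hook.SFTPHook'],
#     {'apache-airflow-providers-sftp'})
```

Note that matching on `old_path` only catches references that spell out the full dotted path; a `from ... import ...` statement splits the path across the line, so this sketch (like the one-line fix itself) assumes flagging full-path references is the intended behavior.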