Inpaint4Drag: Repurposing Inpainting Models for Drag-Based Image Editing via Bidirectional Warping (ICCV 2025)
Jingyi Lu, Kai Han
Visual AI Lab, The University of Hong Kong
TL;DR: Inpaint4Drag decomposes drag-based image editing into bidirectional warping (0.01 s) and image inpainting (0.3 s), measured at 512×512. The method warps the selected regions using bidirectional mapping, then fills the revealed areas with inpainting models. Key advantages:
- Physics-inspired: Treats image regions as elastic materials for intuitive deformation
- Real-time interaction: Instant warping preview unlike existing methods requiring minutes per edit
- Universal adapter: Works with any inpainting model without modifications, inheriting future improvements
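To make the decomposition concrete, below is a minimal, self-contained sketch of the warp-then-inpaint idea. It is not the actual Inpaint4Drag implementation: the warp is a simplified rigid shift of the selected region rather than the paper's elastic bidirectional mapping, and OpenCV's classical Telea inpainting (from opencv-python) stands in for the learned inpainting model.

```python
# Conceptual sketch of "warp the selected region, then inpaint the revealed holes".
# NOT the Inpaint4Drag implementation: rigid shift + OpenCV inpainting as stand-ins.
import cv2
import numpy as np


def drag_edit(image, region_mask, handle, target):
    """image: HxWx3 uint8, region_mask: HxW bool, handle/target: (x, y) pixel coords."""
    h, w = region_mask.shape
    dx, dy = target[0] - handle[0], target[1] - handle[1]

    # 1) Warp: move every selected pixel by the drag vector (simplified rigid shift).
    ys, xs = np.nonzero(region_mask)
    new_xs = np.clip(xs + dx, 0, w - 1)
    new_ys = np.clip(ys + dy, 0, h - 1)
    warped = image.copy()
    warped[new_ys, new_xs] = image[ys, xs]

    # 2) Mask: pixels the region moved away from and left uncovered become holes.
    covered = np.zeros((h, w), dtype=bool)
    covered[new_ys, new_xs] = True
    holes = ((region_mask & ~covered) * 255).astype(np.uint8)

    # 3) Inpaint: fill the revealed holes; Inpaint4Drag plugs a learned
    #    inpainting model in here instead of OpenCV's classical Telea method.
    return cv2.inpaint(warped, holes, inpaintRadius=3, flags=cv2.INPAINT_TELEA)


# Example: drag a 40x40 patch of a synthetic gradient 30 px to the right.
gray = np.tile(np.linspace(0, 255, 256, dtype=np.uint8), (256, 1))
img = cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)
mask = np.zeros((256, 256), dtype=bool)
mask[100:140, 100:140] = True
out = drag_edit(img, mask, handle=(120, 120), target=(150, 120))
```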
After cloning this repository, you can install the dependencies through the following steps:
# Create and activate environment
conda create -n inpaint4drag python=3.10 -y
conda activate inpaint4drag
# Install requirements
pip install torch torchvision
pip install -r requirements.txt
# Install EfficientViT-SAM (Optional)
pip install git+https://github.com/mit-han-lab/efficientvit.git
After installing the requirements, you can launch the user interface with:
python app.py
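If the interface fails to start, a quick sanity check is to confirm that PyTorch was installed with GPU support (the inpainting model runs much faster on a GPU; exact hardware requirements depend on the inpainting backend you use):

```python
import torch

print("torch", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```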
You can evaluate our method by (1) downloading the DragBench dataset and (2) running evaluation.py on DragBench-DR and DragBench-SR. You can also use utils/evaluator.py to evaluate other drag-based editing methods.
# Install gdown and download drag_data.zip from Google Drive
pip install gdown
gdown 1rdi4Rqka8zqHTbPyhQYtFC2UdWvAeAGV
# Extract and clean up
unzip -q drag_data.zip
rm -rf drag_data.zip
Evaluate on DragBench-DR and DragBench-SR:
python evaluation.py --data_dir drag_data/dragbench-dr --output_dir output/dragbench-dr
python evaluation.py --data_dir drag_data/dragbench-sr --output_dir output/dragbench-sr
Found this repo useful? We'd be grateful if you could give it a star ⭐ or cite our paper!
@inproceedings{lu2025inpaint4drag,
author = {Jingyi Lu and Kai Han},
title = {Inpaint4Drag: Repurposing Inpainting Models for Drag-Based Image Editing via Bidirectional Warping},
booktitle = {International Conference on Computer Vision (ICCV)},
year = {2025},
}
- Drag Your GAN: Interactive Point-based Manipulation on the Generative Image Manifold
- DragDiffusion: Harnessing Diffusion Models for Interactive Point-based Image Editing
- The Blessing of Randomness: SDE Beats ODE in General Diffusion-based Image Editing
- InstantDrag: Improving Interactivity in Drag-based Image Editing
- FastDrag: Manipulate Anything in One Step