Description
Today I was working on an issue in miss-islington. In particular, I was working on python/miss-islington#590. Per the dev guide, I have the repo configured with a fork:
miss-islington main $ git remote -v
origin https://github.com/jaraco/miss-islington.git (fetch)
origin https://github.com/jaraco/miss-islington.git (push)
upstream https://github.com/python/miss-islington (fetch)
upstream https://github.com/jaraco/miss-islington.git (push)
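For reference, that layout is roughly what you get from cloning the fork as origin and then adding the upstream remote with its push URL redirected back to the fork, along these lines (reconstructed, not copied from my shell history):

git remote add upstream https://github.com/python/miss-islington
# redirect pushes on "upstream" to the fork, per the triangular-workflow guidance
git remote set-url --push upstream https://github.com/jaraco/miss-islington.git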
I used the gh tool to check out the pull request:
miss-islington main $ gh pr checkout 590
branch 'update-ci-py-versions' set up to track 'upstream/update-ci-py-versions'.
Switched to a new branch 'update-ci-py-versions'
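(As a sanity check, the tracking that gh set up can be inspected after the fact; these read-only commands aren't from the session above:)

git branch -vv                                         # shows [upstream/update-ci-py-versions]
git config --get branch.update-ci-py-versions.remote   # prints "upstream"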
I made a revision and then wished to push that revision to the branch from which I'd pulled it. However, because upstream's push URL is configured to point at jaraco/miss-islington, pushing the changes didn't go to the original branch but went to my fork instead:
miss-islington update-ci-py-versions $ git commit -a -m "Pin kombu on Python 3.12. Workaround for celery/kombu#1600"
[update-ci-py-versions 8a06689] Pin kombu on Python 3.12. Workaround for celery/kombu#1600
1 file changed, 3 insertions(+)
miss-islington update-ci-py-versions $ git push
Enumerating objects: 5, done.
Counting objects: 100% (5/5), done.
Delta compression using up to 8 threads
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 1.03 KiB | 1.03 MiB/s, done.
Total 3 (delta 2), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (2/2), completed with 2 local objects.
To https://github.com/jaraco/miss-islington.git
c01dfc5..8a06689 update-ci-py-versions -> update-ci-py-versions
I guess that makes sense. The branch tracks upstream, but upstream's push URL points at my fork, so git push goes to the fork rather than to the repo the PR actually lives in.
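The push destination can be confirmed without actually pushing; something like this (not part of the original session) makes the fork URL visible:

git remote get-url --push upstream   # prints https://github.com/jaraco/miss-islington.git
git push --dry-run                   # previews where the branch would go without sending anything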
As a result, I needed to manually push the changes to the actual upstream, which led to another problem:
miss-islington update-ci-py-versions $ git push gh://python/cpython
Enumerating objects: 1390, done.
Counting objects: 100% (1390/1390), done.
Delta compression using up to 8 threads
Compressing objects: 100% (499/499), done.
Writing objects: 100% (1390/1390), 318.04 KiB | 318.04 MiB/s, done.
Total 1390 (delta 865), reused 1384 (delta 862), pack-reused 0
remote: Resolving deltas: 100% (865/865), done.
remote:
remote: Create a pull request for 'update-ci-py-versions' on GitHub by visiting:
remote: https://github.com/python/cpython/pull/new/update-ci-py-versions
remote:
To https://github.com/python/cpython
* [new branch] update-ci-py-versions -> update-ci-py-versions
miss-islington update-ci-py-versions $ git push gh://python/cpython :update-ci-py-versions
To https://github.com/python/cpython
- [deleted] update-ci-py-versions
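For anyone reading along, the empty-source refspec (:update-ci-py-versions) is what performs the deletion; the same operation can be spelled more explicitly as:

git push gh://python/cpython --delete update-ci-py-versions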
Because I needed to manually type the URL, I mistakenly used python/cpython (out of habit) instead of the correct repo (python/miss-islington), so I ended up pushing draft commits for miss-islington to cpython (oops). I hope the deletion effectively undid that change. I then pushed the changes to the intended target.
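To double-check that the accidental branch really is gone, a read-only query like this (not run at the time) should return nothing:

git ls-remote --heads https://github.com/python/cpython update-ci-py-versions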
miss-islington update-ci-py-versions $ git push gh://python/miss-islington
Enumerating objects: 5, done.
Counting objects: 100% (5/5), done.
Delta compression using up to 8 threads
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 1.03 KiB | 1.03 MiB/s, done.
Total 3 (delta 2), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (2/2), completed with 2 local objects.
To https://github.com/python/miss-islington
c01dfc5..8a06689 update-ci-py-versions -> update-ci-py-versions
That worked and triggered the CI run again.
What I wonder is: is there a way to configure the fork workflow such that, when one checks out a PR in the upstream repo, it's tracked properly?
I imagine one could configure upstream to also push to the upstream repo, but that would also increase the risk of someone accidentally pushing a branch to upstream.
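One possible shape for that, sketched here and untested: leave upstream's push URL pointing at the fork, add a separate remote whose push URL really is the upstream repo (the name python-push below is made up), and opt individual PR branches into it via branch.<name>.pushRemote:

# hypothetical extra remote; the name "python-push" is illustrative only
git remote add python-push https://github.com/python/miss-islington
# only this branch pushes to the real upstream; everything else still goes to the fork
git config branch.update-ci-py-versions.pushRemote python-push

That keeps the accidental-push surface small, since the override is per branch rather than remote-wide, but it still has to be set up by hand for each PR checkout.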