
NAS BnR: Restore backed-up volume on live instances is not readable #10844


Merged: 1 commit merged into apache:4.20 on May 14, 2025

Conversation

@abh1sar (Collaborator) commented May 10, 2025

Description

Issues:

  1. Restoring a backed-up volume onto a live instance attaches the qcow2 volume as a raw image, leaving the volume unreadable by the instance.
  2. In the CloudStack database, the format of the restored volume is recorded as RAW (for both live and stopped VMs). This has no immediate effect, but it could cause problems in other subsystems, such as volume migration.

Fix:

  1. The virsh attach-disk command used to attach the restored volume to the live instance now includes the --subdriver qcow2 argument.
  2. NAS backup always converts the disk to qcow2 format while backing up, so it is safe to always record the restored volume's format as QCOW2 in the database.
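The corrected attach invocation from fix (1) can be sketched as below; the domain name, volume path, and target device are placeholders, not values from this PR. The command is built as a string and only printed, so the sketch can be inspected without a live libvirt host:

```shell
# Placeholders (not from the PR); adjust for a real environment.
DOMAIN="myvm"
VOLUME="/mnt/primary/restored-datavol.qcow2"
TARGET="vdb"

# Without --subdriver, libvirt gets no format hint and treats the image
# as raw, which is the bug this PR fixes. Passing --subdriver qcow2
# yields <driver name='qemu' type='qcow2' .../> in the domain XML.
CMD="virsh attach-disk $DOMAIN $VOLUME $TARGET --driver qemu --subdriver qcow2 --cache none"
echo "$CMD"
```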

Types of changes

  • Breaking change (fix or feature that would cause existing functionality to change)
  • New feature (non-breaking change which adds functionality)
  • Bug fix (non-breaking change which fixes an issue)
  • Enhancement (improves an existing feature and functionality)
  • Cleanup (Code refactoring and cleanup, that may add test cases)
  • build/CI
  • test (unit or integration test code)

Feature/Enhancement Scale or Bug Severity

Feature/Enhancement Scale

  • Major
  • Minor

Bug Severity

  • BLOCKER
  • Critical
  • Major
  • Minor
  • Trivial

Screenshots (if appropriate):

How Has This Been Tested?

  1. Create a VM with a data volume.
  2. Create an ext4 filesystem on the data volume, mount it, and create a file.
  3. Take a NAS backup.
  4. Create another instance and restore the data volume from the backup to the new instance.
  5. Try to mount the data volume.
     Before the fix, mounting fails with an error (screenshot from 2025-05-09 22-18-23), and the domain XML shows the disk as raw:

       <disk type='file' device='disk'>
         <driver name='qemu' type='raw' cache='none'/>

After the fix, both issues are resolved.
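The guest-side part of the steps above can be sketched as follows; the device node and mount point are placeholders, and the backup/restore steps themselves go through the CloudStack UI or API, so they appear only as comments. This is an illustration, not a script meant to run as-is:

```shell
# Steps 2 and 5 of the manual test, run inside the guest (as root).
# /dev/vdb and /mnt/data are placeholders for the attached data volume.
mkfs.ext4 /dev/vdb
mkdir -p /mnt/data
mount /dev/vdb /mnt/data
echo "backup marker" > /mnt/data/marker.txt
umount /mnt/data

# (Steps 3-4: take the NAS backup and restore the volume to a new
#  instance via the CloudStack UI or API.)

# On the new instance: before the fix this mount failed, because the
# guest saw the qcow2 container attached as a raw device with no
# recognizable filesystem; after the fix it mounts and the file reads back.
mount /dev/vdb /mnt/data
cat /mnt/data/marker.txt
```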

How did you try to break this feature and the system with this change?


codecov bot commented May 10, 2025

Codecov Report

Attention: Patch coverage is 0% with 1 line in your changes missing coverage. Please review.

Project coverage is 16.13%. Comparing base (b8359e8) to head (5ad1a43).
Report is 120 commits behind head on 4.20.

Files with missing lines Patch % Lines
...rg/apache/cloudstack/backup/NASBackupProvider.java 0.00% 1 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff              @@
##               4.20   #10844      +/-   ##
============================================
+ Coverage     16.00%   16.13%   +0.12%     
- Complexity    13104    13217     +113     
============================================
  Files          5651     5651              
  Lines        495841   496759     +918     
  Branches      60044    60183     +139     
============================================
+ Hits          79365    80136     +771     
- Misses       407613   407699      +86     
- Partials       8863     8924      +61     
Flag Coverage Δ
uitests 4.00% <ø> (+<0.01%) ⬆️
unittests 16.98% <0.00%> (+0.13%) ⬆️


@Pearl1594 (Contributor) left a comment:

code lgtm

@abh1sar (Collaborator, Author) commented May 12, 2025:

@blueorangutan package

@blueorangutan:

@abh1sar a [SL] Jenkins job has been kicked to build packages. It will be bundled with KVM, XenServer and VMware SystemVM templates. I'll keep you posted as I make progress.

@Pearl1594 Pearl1594 moved this to In Progress in ACS 4.20.1 May 12, 2025
@Pearl1594 Pearl1594 added this to the 4.20.1 milestone May 12, 2025
@blueorangutan:

Packaging result [SF]: ✔️ el8 ✔️ el9 ✔️ debian ✔️ suse15. SL-JID 13336

@abh1sar (Collaborator, Author) commented May 12, 2025:

@blueorangutan test

@blueorangutan:

@abh1sar a [SL] Trillian-Jenkins test job (ol8 mgmt + kvm-ol8) has been kicked to run smoke tests

@blueorangutan:

[SF] Trillian test result (tid-13268)
Environment: kvm-ol8 (x2), Advanced Networking with Mgmt server ol8
Total time taken: 54714 seconds
Marvin logs: https://github.com/blueorangutan/acs-prs/releases/download/trillian/pr10844-t13268-kvm-ol8.zip
Smoke tests completed. 141 look OK, 0 have errors, 0 did not run
No failed or skipped tests to report.

@borisstoyanov borisstoyanov self-assigned this May 14, 2025
@borisstoyanov (Contributor) left a comment:

LGTM, managed to manually test the bug fix. I was able to restore the volume and mount and read the data on a new Instance.

@borisstoyanov borisstoyanov removed their assignment May 14, 2025
@DaanHoogland DaanHoogland merged commit d55aa70 into apache:4.20 May 14, 2025
20 of 25 checks passed
@github-project-automation github-project-automation bot moved this from In Progress to Done in ACS 4.20.1 May 14, 2025
@DaanHoogland DaanHoogland deleted the restore-volume-nas branch May 14, 2025 13:23