
Conversation

@francescobrivio (Contributor) commented Jul 14, 2022

Replay Request

Requestor
Francesco Brivio for AlCaDB

Describe the configuration

  • Release: CMSSW_12_4_3
  • Run: 355189,355559
  • GTs:
    • expressGlobalTag: 124X_dataRun3_Express_v4
    • promptrecoGlobalTag: 124X_dataRun3_Prompt_v4
    • alcap0GlobalTag: 124X_dataRun3_Prompt_v4
  • Additional changes:
    • Update collision run to be replayed to 355559
      (~1h long, 300 bunches, all subsystems included, good rates for all streams, FullReco prescale used)
    • Switch to el8_amd64_gcc10 arch
    • Fix the name of the ALCAPPS scenario (AlCaPPS_Run3)
    • Moved the ALCAPPS Express config to the right place (together with the other Express configs)
    • Added the SiPixel HighGranularity PCL wf (PromptCalibProdSiPixelAliHG) already tested in Test new high granularity Tracker Alignment PCL #4704
    • Added the PPS Sampic Timing PCL wf (PromptCalibProdPPSDiamondSampic) which has been fixed in 12_4_3 ([124X] Rename PPS Diamond Sampic PCL path and task cms-sw/cmssw#38674)
    • Added the @heavyFlavor DQM sequence to the DoubleMuonLowMass dataset, as suggested at the PPD Coordination meeting (see the Tracking POG report)
    • Following @davidlange6's suggestion, cleaned up all the Repack and Express overrides for releases older than CMSSW_12_3_0

Purpose of the test
This replay is to test the updates of the Tier0 prod and replay configs in preparation for the switch to CMSSW_12_4_3.

T0 Operations cmsTalk thread
N/A
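The settings listed above can be summarized in a small, self-contained sketch (plain Python, not the actual dmwm/T0 configuration API; the dict layout and helper are hypothetical), including a consistency check that each global tag's cycle prefix matches the release (CMSSW_12_4_X → "124X"):

```python
# Hypothetical summary of the replay settings described in this request.
# This is NOT the dmwm/T0 ReplayOfflineConfiguration API, just plain data.
replay = {
    "release": "CMSSW_12_4_3",
    "scram_arch": "el8_amd64_gcc10",
    "runs": [355189, 355559],
    "global_tags": {
        "expressGlobalTag": "124X_dataRun3_Express_v4",
        "promptrecoGlobalTag": "124X_dataRun3_Prompt_v4",
        "alcap0GlobalTag": "124X_dataRun3_Prompt_v4",
    },
}

def gt_cycle_ok(release: str, gt: str) -> bool:
    """Check that a GT's cycle prefix (e.g. '124X') matches the release."""
    parts = release.split("_")          # "CMSSW_12_4_3" -> ["CMSSW","12","4","3"]
    cycle = parts[1] + parts[2] + "X"   # -> "124X"
    return gt.startswith(cycle)

# All three GTs in this replay should belong to the 124X cycle.
assert all(gt_cycle_ok(replay["release"], gt)
           for gt in replay["global_tags"].values())
```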

@francescobrivio (Contributor, Author)

test syntax please

@francescobrivio (Contributor, Author)

Of course we need to wait for CMSSW_12_4_3 before we can launch the replay 😄

@germanfgv (Contributor)

@francescobrivio Thank you for creating this.
Just a comment: Runs 353737 and 353739 were not saved, so we don't have streamer files for them. Can you change them, please?

@francescobrivio (Contributor, Author)

> @francescobrivio Thank you for creating this. Just a comment: Runs 353737 and 353739 were not saved so we don't have streamer files for them. Can you change them, please?

Thanks for the comment German, I'll check which other runs are good.
Is this twiki https://twiki.cern.ch/twiki/bin/view/CMSPublic/CompOpsTier0TeamReplayGoodRuns up to date?

@francescobrivio (Contributor, Author)

test syntax please

```python
    disk_node="T1_US_FNAL_Disk",
    scenario=ppScenario)

DATASETS = ["ParkingDoubleMuonLowMass"]
```
Review comment (Contributor):

Isn't ParkingDoubleMuonLowMass swapped with ReservedDoubleMuonLowMass here, according to what's in the gDoc from today's meeting?

[Screenshot from 2022-07-15 15-45-19]

```diff
     scenario=ppScenario)

-DATASETS = ["DoubleMuonLowMass"]
+DATASETS = ["ReservedDoubleMuonLowMass"]
```
Review comment (Contributor, Author):

And if this dataset doesn't get prompt-recoed, I guess you should also remove the skims, alca_producers, and dqm_sequences?
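The point above can be illustrated with a small sketch (a hypothetical helper on plain dicts, not the actual dmwm/T0 configuration API): when a dataset is not prompt-recoed, its reco-side settings have no effect and can be dropped.

```python
# Hypothetical illustration, NOT the dmwm/T0 API: if a dataset is not
# prompt-recoed, reco-only settings (skims, alca_producers, dqm_sequences)
# are dead configuration and can be removed.
def prune_dataset_config(cfg: dict) -> dict:
    """Return a copy of cfg with reco-only keys removed when do_reco is False."""
    reco_only = ("skims", "alca_producers", "dqm_sequences")
    if cfg.get("do_reco", True):
        return dict(cfg)
    return {k: v for k, v in cfg.items() if k not in reco_only}

# Example dataset name taken from the review above; values are placeholders.
reserved = {
    "name": "ReservedDoubleMuonLowMass",
    "do_reco": False,
    "skims": ["LogError"],
    "alca_producers": [],
    "dqm_sequences": ["@common"],
}
pruned = prune_dataset_config(reserved)
assert "skims" not in pruned and "dqm_sequences" not in pruned
```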

@francescobrivio (Contributor, Author)

@jhonatanamado @germanfgv since the changes to the prod config are being implemented in #4717, should I revert all the changes in this PR and keep only the updates to the replay config (so we can use this PR for the replay)?

@germanfgv (Contributor)

I started the replay with the configuration before Jhonatan's changes. It can be monitored here:

https://monit-grafana.cern.ch/d/t_jr45h7k/cms-tier0-replayid-monitoring?orgId=11&var-Bin=5m&var-ReplayID=220716093354&var-JobType=All&var-WorkflowType=All&refresh=5s

@tvami (Contributor) commented Jul 18, 2022

Hi @francescobrivio, I understand #4717 is superseding this PR, right? Can you please check #4717 and close this PR if everything is fine there? Thanks (as ORM :P)

@francescobrivio (Contributor, Author)

Hi @tvami, actually #4717 is not updating the replay config, only the prod one.
So maybe I can force-push again so that this PR only updates the replay config and can be merged (since the replay was successful)? Would that work for you?

@tvami (Contributor) commented Jul 18, 2022

Yes, good point, ok let's do this as the replay-config only update then. Thanks!
It currently also says "This branch has conflicts that must be resolved"

@francescobrivio (Contributor, Author)

Ok, I managed to clean up the changes in the prod config (which was already updated in #4717), so this PR now only updates the replay config.

I think this can be merged if you agree @tvami @jhonatanamado?

BTW, once we have new data from today, this config will have to be updated again to include the new PDs...

@jhonatanamado (Contributor)

Hi @francescobrivio. Yes, this will be merged, and then I will update the replay configuration with the new PDs.

@jhonatanamado jhonatanamado merged commit 9d6854a into dmwm:master Jul 18, 2022
@francescobrivio francescobrivio deleted the test_CMSSW_12_4_3 branch September 21, 2022 12:53